Visual scanpaths during memory retrieval tasks were associated with the quality of the memory. Researchers say the replay of a sequence of eye movements helps boost memory reconstruction.
EyeSyn, a newly developed "virtual eye," simulates how humans look at the world accurately enough to support the development of new augmented reality programs and could help create applications for the metaverse.
A newly developed AI algorithm can directly predict eye position and movement during an MRI scan. The technology could provide new diagnostics for neurological disorders that manifest in changes in eye-movement patterns.
Adolescents and older adults pay less attention to social cues in real-world interactions than young adults do.
A new digital app has proven successful in detecting one key symptom associated with ASD in young children. The app, which combines gaze tracking and machine learning algorithms, could be an inexpensive new tool to help with the diagnosis of autism.
A new study reveals the relationship between attentional state and emotions based on pupillary reactions. Visual perception elicits emotions in all attentional states, while auditory perception elicits emotions only when attention is paid to sounds.
Using scenes from movies, researchers discover how different brain areas can be used flexibly and as needed. The study sheds light on how the brain transitions between moral thinking and empathy.
Eye movements distinguish between "go" and "no go" actions early in the decision-making process, before the hand even starts to move.
Study reveals that two different brain structures are implicated in implicit and explicit theory of mind, and that the two regions mature at different ages to fulfill their functions. The supramarginal gyrus matures earlier, enabling theory of mind to emerge slightly earlier than previously believed. Full theory-of-mind ability develops at age four, when the temporoparietal junction matures.