A newly developed AI algorithm can directly predict eye position and movement during an MRI scan. The technology could provide new diagnostics for neurological disorders that manifest in changes in eye-movement patterns.
In socially awkward situations when a person is caught staring and averts their eyes, a third-party observer does not reflexively follow their gaze. The brain tells the observer there is no significance to the location where the embarrassed person has turned their attention.
The initial reaction of the brain is independent of the facial emotional expression we see. It is only after the eye movement is completed that the brain shows strong responses to the emotional expression of a face.
Combining neuroimaging data with deep convolutional neural networks, researchers were able to predict where people would direct their attention and gaze at images of natural scenes.
A new eye tracking study reveals skilled musicians read musical notes only slightly faster than novices, but in that extra time, professional musicians are able to add flourishes and play around with the music, interpreting it in their own manner.
An eye tracking technique that measures small involuntary eye movements may provide a new method for monitoring temporal expectations in people with ADHD, a new study reports.
Even during early stages of the disease, gut bacteria in those with Parkinson's differs significantly from those without the disease, a new study reports.
A simple eye test may be a useful tool in helping to diagnose ASD, a new study reports. Researchers measured eye movement in those on the autism spectrum and found they continually missed a specific target. The researchers suggest that sensorimotor control in the cerebellum, which normally governs eye movement, could be impaired in those with ASD.