Results from a new artificial intelligence study indicate that number sense arises spontaneously in the visual system, without any prior experience of counting.
EmoNet, a new convolutional neural network, can accurately decode images into eleven distinct emotional categories. By training the AI on over 25,000 images, researchers demonstrate that image content alone is sufficient to predict the category and valence of human emotional responses.
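The pipeline described above — a convolutional network mapping image pixels to probabilities over a fixed set of emotion categories — can be illustrated with a minimal sketch. This is not the authors' EmoNet architecture; it is a toy, untrained classifier (hypothetical shapes and random parameters) showing only the general structure: convolution, nonlinearity, pooling, and a softmax over eleven categories.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CATEGORIES = 11  # emotional categories, per the summary above


def conv2d(image, kernels):
    """Valid 2D cross-correlation of a single-channel image with a kernel bank."""
    kh, kw = kernels.shape[1:]
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((kernels.shape[0], h, w))
    for k, kern in enumerate(kernels):
        for i in range(h):
            for j in range(w):
                out[k, i, j] = np.sum(image[i:i + kh, j:j + kw] * kern)
    return out


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


def classify(image, kernels, weights, bias):
    feats = np.maximum(conv2d(image, kernels), 0)  # ReLU nonlinearity
    pooled = feats.mean(axis=(1, 2))               # global average pooling
    return softmax(weights @ pooled + bias)        # probabilities over categories


# Random (untrained) parameters, for illustration only.
image = rng.random((16, 16))                         # toy grayscale image
kernels = rng.standard_normal((4, 3, 3))             # 4 small learned filters
weights = rng.standard_normal((N_CATEGORIES, 4))
bias = np.zeros(N_CATEGORIES)

probs = classify(image, kernels, weights, bias)      # one probability per category
```

In a real system the kernels and weights would be learned from the labeled training images rather than drawn at random; the sketch only shows how an image ends up as a distribution over emotion categories.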
A new map of the octopus visual system classifies different types of neurons in a part of the brain dedicated to vision, shedding new light on the evolution of brains and visual systems more broadly.
Researchers propose a new theory of what happens in the brain when we experience familiar-seeming visual stimuli. The theory, dubbed sensory referenced suppression, suggests the brain learns the levels of activation expected for a given sensory input and corrects for them, leaving behind the signal for familiarity.
The visual system adapts to the loss of photoreception by increasing its sensitivity, but at the same time becomes harmfully hyperactive. The findings could lead to new therapies to protect vision or reverse vision loss.
Researchers report that a new EEG system captures more information from the visual cortex than earlier versions of the same system.
Current deep learning models are able to create images that strongly activate specific neurons in the visual cortex. However, researchers say more accurate artificial neural network models should be developed to enable more precise control.
A new study from Harvard researchers reveals that in primates, the blueprint for adult visual organization is in place a few days after birth and 'fills in' with experience. Because this organization is present so early in life, the researchers believe it is likely established before birth by a genetically programmed mechanism and later modified by experience.