Artificial intelligence reveals how the brain programs procedural knowledge.
High-resolution imaging reveals the human cerebellum has a surface area equal to roughly 80% of that of the cerebral cortex. The findings suggest this brain region grew larger as human behavior and cognition evolved.
Researchers have developed a new technique that uses EEG data to reconstruct images of faces based on how we perceive them.
Researchers report they can use brain activation patterns to identify complex thoughts. Their findings suggest the building blocks of complex human thoughts are not word-based but are formed by the brain's subsystems. The study provides evidence that the neural dimensions of concept representation are universal across people and languages.
Researchers use the TV show 'Sherlock' to investigate how the brain segments experiences during perception and long-term memory formation.
In superagers, both the default mode network and the salience network showed stronger connectivity than in typical older adults and connectivity similar to that of younger adults. Superagers performed as well as young adults, and better than typical older adults, on recognition and episodic memory tasks.
A new neuroimaging study reports that people born with one hand who are able to use both limbs in a two-handed task show brain activity similar to that of people with two hands.
Researchers explore how some people can feel a physical sensation when they witness another person being touched.
3D images show how a baby's brain and skull change shape during labor and delivery.
Adults who played Pokemon video games as children had preferential activation in the visual system for Pokemon characters, researchers report. The findings shed light on the development of the visual system and of categorization in the brain.
Activity in the frontoparietal network during memory tasks reflected individual children's working memory capacity, with an activity pattern unique to working memory.
EmoNet, a new convolutional neural network, can accurately classify images into eleven distinct emotion categories. After training the AI on more than 25,000 images, the researchers demonstrate that image content alone is sufficient to predict the category and valence of human emotional responses.
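As an illustration of the general approach, the sketch below shows one common way to build such a classifier: a pretrained image backbone is reused and only its final layer is retrained to map image content to emotion labels. This is not the authors' implementation; the AlexNet backbone, the category count, the dataset path, and the hyperparameters are assumptions made for the example.

```python
# Minimal sketch of CNN-based emotion-category image classification.
# Assumptions: an AlexNet backbone, 11 labels, and a folder layout
# emotion_images/train/<category_name>/*.jpg are all illustrative.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CATEGORIES = 11  # hypothetical emotion label set

# Standard ImageNet-style preprocessing expected by the pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

train_set = datasets.ImageFolder("emotion_images/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Load a pretrained backbone, freeze it, and replace only the final
# classification layer so object-recognition features are reused.
model = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
model.classifier[6] = nn.Linear(4096, NUM_CATEGORIES)

optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in train_loader:
        optimizer.zero_grad()
        logits = model(images)           # per-image score for each emotion category
        loss = criterion(logits, labels)
        loss.backward()
        optimizer.step()

# At inference time, softmax over the logits gives a probability
# for each emotion category given only the image content:
# probs = torch.softmax(model(images), dim=1)
```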