Jump-Starting Vision With the Power of Words

Cognitive scientists have come to view the brain as a prediction machine, constantly comparing what is happening around us to expectations based on experience — and considering what should happen next.

“These predictions, most of them unconscious, include predicting what we’re about to see,” says Gary Lupyan, a University of Wisconsin-Madison psychology professor.

Work in Lupyan’s lab has demonstrated the predictive process by manipulating the connection between language and vision in the brain. A study published recently in the Journal of Neuroscience shows that words have a profound effect even on the first electrical twitches of perception.

Lupyan’s collaborator, Bastien Boutonnet of Leiden University in the Netherlands, showed people dozens of pictures from a group of 10 common objects or animals. Before each picture appeared, participants first heard a word (“dog,” for instance) or a nonverbal sound (such as a dog barking). The study subjects simply decided whether the word or sound matched the image.

In an earlier study, Lupyan showed that people are faster to recognize pictures after hearing words than after hearing highly familiar nonverbal sounds. The new study by Boutonnet and Lupyan may have revealed the mechanism behind this “label advantage.”

Their study subjects wore nets of electrodes on their scalps, recording electrical activity in their brains as they viewed the images. The results showed that hearing words versus non-word sounds made a significant difference in a well-known peak in brain activity occurring within one-tenth of a second after the eyes fall on an image.

The study, supported by the National Science Foundation, is the first to show that a word cue — or a cue of any kind — has such a basic effect on the way the brain processes visual information.

Image: The visual system in the brain. Image is for illustrative purposes only. Image credit: Polina Tishina.

“Even in that first hundred milliseconds of the earliest stages of visual processing, half a second before they respond, you can see language shaping perceptual mechanisms to make more effective predictions of what is about to occur,” Lupyan says.

The results leave a puzzle: Nonverbal sounds like a bark can carry quite a bit more information than a simple word like “dog.”

“The volume and pitch of the bark can tell you not only that there’s a dog nearby, but the size of the dog and whether it’s angry or frightened or playful, right?” Lupyan says.

So why are labels better than nonverbal sounds in helping us recognize what is to come?

In a study recently published in the journal Cognition, Lupyan and Pierce Edmiston, a graduate student at UW-Madison, write that all those specifics, while handy, are not necessarily relevant to the fundamental dog-ness of dogs in general. The word “dog” cuts through to the simplest, most general information necessary for recognizing a picture as a dog.

“Words are ideal for activating categories in the mind,” says Lupyan. “Think about a guitar — you can say ‘guitar,’ leaving unspecified whether it’s an electric or acoustic kind. But if you were to see or hear a guitar, you would know immediately whether it’s an electric or an acoustic one. Our senses cannot be ambiguous about this.”

The only way to meaningfully convey the general notion of a guitar may be by using language.

“Language allows us this uniquely human way of thinking in generalities,” Lupyan says. “This ability to transcend the specifics and think about the general may be critically important to logic, mathematics, science, and even complex social interactions.”

About this vision and neuroscience research

Source: Chris Barncard – University of Wisconsin-Madison
Image Credit: The image is credited to Polina Tishina and is in the public domain
Original Research: Abstract for “Words Jump-Start Vision: A Label Advantage in Object Recognition” by Bastien Boutonnet and Gary Lupyan in Journal of Neuroscience. Published online June 24 2015 doi:10.1523/JNEUROSCI.5111-14.2015


Abstract

Words Jump-Start Vision: A Label Advantage in Object Recognition

People use language to shape each other’s behavior in highly flexible ways. Effects of language are often assumed to be “high-level” in that, whereas language clearly influences reasoning, decision making, and memory, it does not influence low-level visual processes. Here, we test the prediction that words are able to provide top-down guidance at the very earliest stages of visual processing by acting as powerful categorical cues. We investigated whether visual processing of images of familiar animals and artifacts was enhanced after hearing their name (e.g., “dog”) compared with hearing an equally familiar and unambiguous nonverbal sound (e.g., a dog bark) in 14 English monolingual speakers. Because the relationship between words and their referents is categorical, we expected words to deploy more effective categorical templates, allowing for more rapid visual recognition. By recording EEGs, we were able to determine whether this label advantage stemmed from changes to early visual processing or later semantic decision processes. The results showed that hearing a word affected early visual processes and that this modulation was specific to the named category. An analysis of ERPs showed that the P1 was larger when people were cued by labels compared with equally informative nonverbal cues—an enhancement occurring within 100 ms of image onset, which also predicted behavioral responses occurring almost 500 ms later. Hearing labels modulated the P1 such that it distinguished between target and nontarget images, showing that words rapidly guide early visual processing.

