Summary: A new study reveals the relationship between attentional state and emotion from pupillary reactions. Visual perception elicits emotions in all attentional states, while auditory perception elicits emotions only when attention is paid to sounds.
A research team from the Department of Computer Science and Engineering and the Electronics-Inspired Interdisciplinary Research Institute at Toyohashi University of Technology has shown that the relationship between attentional state and elicited emotion may differ between visual and auditory perception. The result was obtained by measuring pupillary reactions, which reflect human emotional responses. It suggests that visual perception elicits emotions in all attentional states, whereas auditory perception elicits emotions only when attention is paid to sounds, revealing a difference in how attentional state relates to emotion across the two senses.
In our daily lives, our emotions are often elicited by the information we receive from visual and auditory perception. As such, many studies up until now have investigated human emotional processing using emotional stimuli such as pictures and sounds. However, it was not clear whether such emotional processing differed between visual and auditory perception.
Our research team asked participants to perform four tasks designed to induce different attentional states while emotionally arousing pictures and sounds were presented, in order to investigate how emotional responses differ between visual and auditory perception. We compared pupillary responses, recorded with an eye tracker, as a physiological indicator of emotional response. Visual stimuli (pictures) elicited emotions during all tasks, whereas auditory stimuli (sounds) did so only during tasks in which attention was paid to the sounds. These results suggest that the relationship between attentional state and emotional response differs for visual and auditory stimuli.
“Traditionally, subjective questionnaires have been the most common method for assessing emotional states. However, in this study, we wanted to extract emotional states while some kind of task was being performed. We therefore focused on the pupillary response, which is receiving a lot of attention as one of the biological signals that reflect cognitive states. Although many studies have reported on attentional states during emotional arousal driven by visual and auditory perception, there have been no previous studies comparing these states across senses, and this is the first attempt”, explains the lead author, Satoshi Nakakoga, Ph.D. student.
In addition, Professor Tetsuto Minami, the leader of the research team, said, “There are more opportunities to come into contact with various visual media via smartphones and other devices and to evoke emotions through that visual and auditory information. We will continue investigating sensory perception that elicits emotions, including the effects of elicited emotions on human behavior.”
Based on the results of this research, our team points to the possibility of a new method of emotion regulation in which the emotional responses elicited by one sense are promoted or suppressed by stimuli from another sense. Ultimately, we hope to establish this method of emotion regulation to help treat psychiatric disorders such as panic and mood disorders.
Funding: The study was funded by the National Institutes of Health, the TUT Program in Personalized Health, the National Center for Research Resources, the National Center for Advancing Translational Sciences, the Howard Hughes Medical Institute, the W.M. Keck Foundation, and the George S. and Delores Doré Eccles Foundation.
Asymmetrical characteristics of emotional responses to pictures and sounds: Evidence from pupillometry
In daily life, our emotions are often elicited by a multimodal environment, mainly visual and auditory stimuli. Therefore, it is crucial to investigate the symmetrical characteristics of emotional responses to pictures and sounds. In this study, we aimed to elucidate the relationship of attentional states to emotional unimodal stimuli (pictures or sounds) and emotional responses by measuring the pupil diameter, which reflects the emotional arousal associated with increased sympathetic activity. Our hypothesis was that the emotional responses to image and sound stimuli are symmetrical: emotion might be suppressed when attentional resources are allocated to another stimulus of the same modality as the emotional stimulus—such as a dot presented at the same time as an emotional image, or a beep presented at the same time as an emotional sound. In our two experiments, data for 24 participants were analyzed for pupillary responses. In experiment 1, we investigated the relationship of the attentional state with emotional visual stimuli (International Affective Picture System) and emotional responses by using pupillometry. We set four task conditions to modulate the attentional state (emotional task, no task, visual detection task, and auditory detection task). We observed that the velocity of pupillary dilation was faster during the presentation of emotionally arousing pictures than during that of neutral ones, regardless of the valence of the pictures. Importantly, this effect did not depend on the task condition. In experiment 2, we investigated the relationship of the attentional state with emotional auditory stimuli (International Affective Digitized Sounds) and emotional responses. We observed a trend towards a significant interaction between the stimulus and the task conditions with regard to the velocity of pupillary dilation.
In the emotional and auditory detection tasks, the velocity of pupillary dilation was faster for positive and neutral sounds than for negative sounds. However, there were no significant differences in the no task and visual detection task conditions. Taken together, the current data reveal that emotional visual and auditory stimuli elicited different pupillary responses, at least in that attentional state had no effect on emotional responses to visual stimuli, even though both experiments followed a symmetrical, well-controlled experimental design.
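To make the key dependent measure concrete, the following is a minimal sketch of how a velocity of pupillary dilation could be computed from a pupil-diameter trace. All names, parameters, and the toy data are illustrative assumptions, not the authors' actual analysis pipeline (which is described in the paper itself).

```python
# Hypothetical sketch: estimating the velocity of pupillary dilation
# from a baseline-corrected pupil-diameter time series.
# Names, sampling rate, and toy traces are illustrative only.

def baseline_correct(samples, n_baseline):
    """Subtract the mean of the pre-stimulus baseline from each sample."""
    baseline = sum(samples[:n_baseline]) / n_baseline
    return [s - baseline for s in samples]

def dilation_velocity(samples, sampling_rate_hz):
    """Mean rate of change of pupil diameter (mm/s) across the trace."""
    if len(samples) < 2:
        raise ValueError("need at least two samples")
    duration_s = (len(samples) - 1) / sampling_rate_hz
    return (samples[-1] - samples[0]) / duration_s

# Toy 2-second traces at 60 Hz: an "emotionally arousing" trial
# dilating faster than a "neutral" one.
rate_hz = 60
emotional = [3.0 + 0.004 * i for i in range(120)]  # ~0.24 mm/s
neutral   = [3.0 + 0.001 * i for i in range(120)]  # ~0.06 mm/s

v_emo = dilation_velocity(baseline_correct(emotional, 10), rate_hz)
v_neu = dilation_velocity(baseline_correct(neutral, 10), rate_hz)
print(v_emo > v_neu)  # faster dilation for the arousing trial
```

In practice, a per-trial slope like this would be averaged over trials within each stimulus and task condition before the conditions are compared statistically.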