Summary: Researchers discover “special” brain activity in both mothers and their children with ASD when they gaze at one another.
Source: Kanazawa University.
Special brain activity in mothers and their children with autism spectrum disorder (ASD) while gazing at each other has been discovered by research teams at Kanazawa University and Osaka University. Using highly specialized equipment based on magnetoencephalography (MEG), the teams made three scientifically important discoveries:
Brain activity during mutual gaze is low in children with severe autism spectrum disorder.
When activity in the brain of a child with autism spectrum disorder is low, activity in the mother’s brain is also low.
When the mother makes a movement in response to her child, such as nodding, activity in the mother’s brain is high.
This study was made possible by advances in equipment development. MEG detects the electrical activity of the brain non-invasively, and hence harmlessly, using superconducting sensor technology with very high temporal and spatial resolution. The researchers developed a child-sized MEG system, the only one of its kind in Japan. On that basis, they then constructed special equipment that can record from an adult and a child simultaneously, a system available nowhere in the world but Kanazawa.
While a parent and a child gaze at each other, enormous amounts of social information are exchanged unconsciously. That is, reading the other’s face allows new emotion to emerge, which appears on one’s own face and in turn affects the other; these interactions continue without cease. Such bidirectional interactions are thought to play an important role in the development of a child’s sociality.
The simultaneous measurement of brain activity while mother and child gaze at each other, achieved in the current study, is expected to be a major step toward elucidating the development of the child’s social brain.
About this autism research article
Funding: Study funded by Ministry of Education, Culture, Sports, Science and Technology of Japan, Japan Society for the Promotion of Science, Japan Science and Technology Agency.
Source: Takashi Shimizu – Kanazawa University
Image Source: NeuroscienceNews.com image is credited to Kanazawa University.
Original Research: Full open access research for “Mu rhythm suppression reflects mother-child face-to-face interactions: a pilot study with simultaneous MEG recording” by Chiaki Hasegawa, Takashi Ikeda, Yuko Yoshimura, Hirotoshi Hiraishi, Tetsuya Takahashi, Naoki Furutani, Norio Hayashi, Yoshio Minabe, Masayuki Hirata, Minoru Asada & Mitsuru Kikuchi in Scientific Reports. Published online October 10, 2016. doi:10.1038/srep34977
Cite This NeuroscienceNews.com Article
MLA: Kanazawa University. “Special Brain Activities While Mom and Child With Autism Gaze at Each Other.” NeuroscienceNews. NeuroscienceNews, 15 November 2016. <https://neurosciencenews.com/parent-gaze-asd-5517/>.
APA: Kanazawa University. (2016, November 15). Special Brain Activities While Mom and Child With Autism Gaze at Each Other. NeuroscienceNews. Retrieved November 15, 2016 from https://neurosciencenews.com/parent-gaze-asd-5517/
Chicago: Kanazawa University. “Special Brain Activities While Mom and Child With Autism Gaze at Each Other.” https://neurosciencenews.com/parent-gaze-asd-5517/ (accessed November 15, 2016).
Mu rhythm suppression reflects mother-child face-to-face interactions: a pilot study with simultaneous MEG recording