
      3D Tongue Movement Images Help Patients Improve Speech

Featured · Neurology
November 18, 2015

      Findings could be helpful for stroke patients.

A new study by University of Texas at Dallas researchers indicates that watching 3-D images of tongue movements can help individuals learn speech sounds.

      According to Dr. William Katz, co-author of the study and professor at UT Dallas’ Callier Center for Communication Disorders, the findings could be especially helpful for stroke patients seeking to improve their speech articulation.

      “These results show that individuals can be taught consonant sounds in part by watching 3-D tongue images,” said Katz, who teaches in the UT Dallas School of Behavioral and Brain Sciences. “But we also are seeking to use visual feedback to get at the underlying nature of apraxia and other related disorders.”

      The study, which appears in the journal Frontiers in Human Neuroscience, was small but showed that participants became more accurate in learning new sounds when they were exposed to visual feedback training.

Katz is one of the first researchers to suggest that visual feedback of tongue movements could help stroke patients recover speech.

      “People with apraxia of speech can have trouble with this process. They typically know what they want to say but have difficulty getting their speech plans to the muscle system, causing sounds to come out wrong,” Katz said.

      “My original inspiration was to show patients their tongues, which would clearly show where sounds should and should not be articulated,” he said.

Recent technological advances allowed researchers to move from 2-D displays to the Opti-Speech technology, which shows 3-D images of the tongue. A previous UT Dallas research project determined that the Opti-Speech visual feedback system can reliably provide real-time feedback for speech learning.
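
To give a sense of what a real-time articulatory feedback loop involves, here is a minimal sketch in Python. The Opti-Speech system itself is not public, so the sensor-reading function, target coordinates, and update rate below are illustrative assumptions rather than its actual interface.

```python
# A minimal sketch of a real-time articulatory feedback loop, assuming a hypothetical
# read_tongue_tip() sensor call; the actual Opti-Speech interface is not public.
import math
import random
import time

TARGET = (0.0, 1.2, 0.5)   # hypothetical tongue-tip target position (cm) for a palatal closure

def read_tongue_tip():
    """Stand-in for one electromagnetic articulography (EMA) sample of the tongue-tip sensor."""
    return (random.gauss(0.0, 0.3), random.gauss(1.0, 0.3), random.gauss(0.4, 0.2))

# A roughly 10 Hz feedback loop; a real system would drive an animated 3-D tongue model, not text.
for _ in range(20):
    x, y, z = read_tongue_tip()
    error = math.dist((x, y, z), TARGET)
    status = "on target" if error < 0.3 else "keep adjusting"
    print(f"tongue tip ({x:+.2f}, {y:+.2f}, {z:+.2f}) cm -> {error:.2f} cm from target ({status})")
    time.sleep(0.1)
```

In the study's setup, each electromagnetic sensor sample drives the animated 3-D tongue model that talkers watch while they speak.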

      Part of the new study looked at an effect called compensatory articulation — when acoustics are rapidly shifted and subjects think they are making a certain sound with their mouths, but hear feedback that indicates they are making a different sound.

      Katz said people will instantaneously shift away from the direction that the sound has pushed them. Then, if the shift is turned off, they’ll overshoot.

Recent technological advances allowed researchers to move from 2-D displays to the Opti-Speech technology, which shows 3-D images of the tongue. Credit: UT Dallas.

      “In our paradigm, we were able to visually shift people. Their tongues were making one sound but, little by little, we start shifting it,” Katz said. “People changed their sounds to match the tongue image.”
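
The shift-and-compensate pattern Katz describes can be illustrated with a toy trial-by-trial model. This is only a sketch of the general idea, with made-up numbers and a simple error-correction rule; it is not the model or parameters used in the study.

```python
# Toy trial-by-trial simulation of the shift-and-compensate idea described above.
# The shift size, learning rule, and trial counts are illustrative assumptions,
# not the study's design or model.

shift_per_trial = 0.05     # how much the displayed/heard feedback is nudged each trial (arbitrary units)
learning_rate = 0.5        # fraction of the perceived error corrected on the next trial
n_shift_trials = 30
n_washout_trials = 10

produced = 0.0             # talker's actual articulation relative to baseline
shift = 0.0

for trial in range(n_shift_trials + n_washout_trials):
    if trial < n_shift_trials:
        shift += shift_per_trial       # gradually shift the feedback, "little by little"
    else:
        shift = 0.0                    # shift turned off (washout)
    perceived = produced + shift       # what the talker sees or hears
    error = perceived                  # deviation from the intended target (baseline = 0)
    produced -= learning_rate * error  # compensate away from the direction of the shift
    print(f"trial {trial + 1:2d}  shift={shift:+.2f}  produced={produced:+.2f}")

# During washout, 'produced' stays displaced on the opposite side of baseline for several
# trials (the overshoot, or aftereffect, Katz describes) before decaying back to zero.
```

The point of the toy model is only the qualitative pattern: production drifts opposite the imposed shift, and when the shift is switched off it remains displaced for a while before returning to baseline.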

Katz said the research results highlight the importance of body visualization in rehabilitation therapy, adding that there is much more work to be done.

      “We want to determine why visual feedback affects speech,” Katz said. “How much is due to compensating, versus mirroring (or entrainment)? Do some of the results come from people visually guiding their tongue to the right place, then having their sense of ‘mouth feel’ take over? What parts of the brain are likely involved?

      “3-D imaging is opening an entirely new path for speech rehabilitation. Hopefully this work can be translated soon to help patients who desperately want to speak better.”

      About this neuroscience research

      Funding: The Opti-Speech study was co-authored by Sonya Mehta, a doctoral student in Communication Sciences and Disorders, and was funded by the UT Dallas Office of Sponsored Projects, the Callier Center Excellence in Education Fund, and a grant awarded by the National Institute on Deafness and Other Communication Disorders.

      Source: Phil Roth – UT Dallas
      Image Source: The image is credited to UT Dallas.
Original Research: Abstract for “Visual feedback of tongue movement for novel speech sound learning” by William F. Katz and Sonya Mehta in Frontiers in Human Neuroscience. Published online November 17, 2015. doi:10.3389/fnhum.2015.00612


      Abstract

      Visual feedback of tongue movement for novel speech sound learning

      Pronunciation training studies have yielded important information concerning the processing of audiovisual (AV) information. Second language (L2) learners show increased reliance on bottom-up, multimodal input for speech perception (compared to monolingual individuals). However, little is known about the role of viewing one’s own speech articulation processes during speech training. The current study investigated whether real-time, visual feedback for tongue movement can improve a speaker’s learning of non-native speech sounds. An interactive 3D tongue visualization system based on electromagnetic articulography (EMA) was used in a speech training experiment. Native speakers of American English produced a novel speech sound (/ɖ̠/; a voiced, coronal, palatal stop) before, during, and after trials in which they viewed their own speech movements using the 3D model. Talkers’ productions were evaluated using kinematic (tongue-tip spatial positioning) and acoustic (burst spectra) measures. The results indicated a rapid gain in accuracy associated with visual feedback training. The findings are discussed with respect to neural models for multimodal speech processing.
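
For readers curious how the two outcome measures named in the abstract might be computed, the sketch below shows one plausible way to derive a tongue-tip positioning error and a burst-spectrum summary. The target coordinates, sampling rate, and signals are invented for illustration; the authors' actual analysis may differ.

```python
# Hedged sketch of the two kinds of outcome measures named in the abstract:
# a kinematic measure (tongue-tip distance from a target position) and an acoustic one
# (a release-burst spectrum summarized by its spectral centroid). All values are made up.
import numpy as np

# --- Kinematic measure: Euclidean error of the tongue-tip sensor at stop closure ---
target_mm = np.array([0.0, 12.0, 5.0])               # hypothetical palatal closure target
tongue_tip_mm = np.array([1.5, 10.8, 4.2])           # hypothetical EMA sample at closure
kinematic_error = np.linalg.norm(tongue_tip_mm - target_mm)
print(f"tongue-tip error: {kinematic_error:.1f} mm")

# --- Acoustic measure: spectral centroid of a 20 ms window at the release burst ---
fs = 22050                                           # assumed sampling rate (Hz)
n = int(0.02 * fs)
burst = np.random.randn(n) * np.hanning(n)           # stand-in for the windowed burst waveform
spectrum = np.abs(np.fft.rfft(burst)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
centroid = (freqs * spectrum).sum() / spectrum.sum()
print(f"burst spectral centroid: {centroid:.0f} Hz")
```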
