Summary: A new digital app has been shown to successfully detect one key symptom associated with ASD in young children. The app, which combines gaze tracking and machine-learning algorithms, could become an inexpensive new tool to aid the diagnosis of autism.
Source: Duke University
A digital app successfully detected one of the telltale characteristics of autism in young children, suggesting the technology could one day become an inexpensive and scalable early screening tool, researchers at Duke University report.
The research team created the app to assess the eye gaze patterns of children while they watched short, strategically designed movies on an iPhone or iPad, then applied computer vision and machine learning to determine whether the child was looking more often at the human in the video, or objects.
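The person-versus-object preference described above can be illustrated with a minimal sketch. This is not the study's actual pipeline (the Duke gaze-estimation code is not reproduced here); the function name, the normalized screen coordinates, and the left/right split are assumptions for illustration only:

```python
# Hypothetical sketch of the screening idea: given per-frame gaze estimates,
# measure how often the child looked at the side of the screen showing the
# person rather than the object. All names and values are illustrative.
import numpy as np

def social_gaze_fraction(gaze_x, person_side="left", screen_width=1.0):
    """Fraction of frames in which estimated gaze falls on the side of
    the screen occupied by the person (vs. the object)."""
    gaze_x = np.asarray(gaze_x, dtype=float)
    midline = screen_width / 2.0
    on_left = gaze_x < midline
    return float(np.mean(on_left if person_side == "left" else ~on_left))

# Toy example: normalized gaze x-coordinates, person shown on the left half.
frac = social_gaze_fraction([0.1, 0.2, 0.3, 0.8, 0.25], person_side="left")
```

A real system would first estimate gaze coordinates from the front-camera video frame by frame; here those estimates are simply taken as given.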
“We know that babies who have autism pay attention to the environment differently and are not paying as much attention to people,” said Geraldine Dawson, Ph.D., director of the Duke Center for Autism and Brain Development, and co-senior author of a study appearing online April 26 in JAMA Pediatrics.
“We can track eye gaze patterns in toddlers to assess risk for autism,” Dawson said. “This is the first time that we’ve been able to provide this type of assessment using only a smart phone or tablet. This study served as a proof-of-concept, and we’re very encouraged.”
Dawson and colleagues — including lead author Zhuoqing Chang, Ph.D., a postdoctoral associate in Duke’s Department of Electrical and Computer Engineering — began developing the app several years ago. In this latest version, the researchers strategically designed movies that would enable them to assess a young child’s preference for looking at objects more than at people.
One movie, for example, shows a cheerful woman playing with a top. She dominates one side of the screen while the top she is spinning is on the other side. Toddlers without autism scanned the entire screen throughout the video, focusing more often on the woman.
Toddlers who were later diagnosed with autism, however, more often focused on the side of the screen with the toy. Another movie was similarly designed and showed a man blowing bubbles. Differences in eye gaze patterns for toddlers with autism were observed across several movies in the app.
Eye-tracking has been used previously to assess gaze patterns in people with autism; however, it has required special equipment and expertise to analyze the gaze patterns. This app, which takes less than 10 minutes to administer and uses the front-facing camera to record the child’s behavior, requires only an iPhone or iPad, making it readily accessible to primary care clinics and usable in home settings.
“This was a technical achievement many years in the making,” Chang said. “It required our research team to design the movies in a specific way to elicit and measure the gaze patterns of attention using only a hand-held device.
“It’s amazing how far we’ve come to achieve this ability to assess eye gaze without specialized equipment, using a common device many have in their pocket,” Chang said.
To test the app, the researchers enrolled 993 toddlers ages 16 to 38 months; the average age was 21 months, an age when autism spectrum disorder (ASD) is often identified. Forty of the toddlers were diagnosed with ASD using gold-standard diagnostic methods.
Dawson said validation studies are ongoing. Additional studies with infants as young as 6 months are investigating whether the app-based assessment can detect, during the first year of life, differences in children who are later diagnosed with autism or other neurodevelopmental disorders.
“We hope that this technology will eventually provide greater access to autism screening, which is an essential first step to intervention. Our long-term goal is to have a well-validated, easy-to-use app that providers and caregivers can download and use, either in a regular clinic or home setting,” Dawson said. “We have additional steps to go, but this study suggests it might one day be possible.”
In addition to Dawson and Chang, study authors include J. Matias Di Martino, Rachel Aiello, Jeffrey Baker, Kimberly Carpenter, Scott Compton, Naomi Davis, Brian Eichner, Steven Espinosa, Jacqueline Flowers, Lauren Franz, Martha Gagliano, Adrianne Harris, Jill Howard, Sam Perochon, Eliana M. Perrin, Pradeep Raj, Marina Spanos, Connor Sullivan, Barbara K. Walter, Scott H. Kollins and Guillermo Sapiro.
Funding: This work was primarily supported by the National Institutes of Health, Autism Centers of Excellence Award (P50HD093074) and the National Institute of Mental Health (R01MH121329, R01MH120093).
Additional support was provided by The Marcus Foundation, the Simons Foundation, the National Science Foundation (NSF-1712867), the Office of Naval Research (N00014-18-1-2143, N00014-20-1-233), the National Geospatial-Intelligence Agency (HM04761912010), Apple, Inc., Microsoft, Inc., Amazon Web Services and Google, Inc. The funders/sponsors had no role in the design and conduct of the study.
Authors Dawson, Chang, Sapiro, Baker, Carpenter, Espinosa and Harris developed technology related to the app that has been licensed to Apple, Inc., and both they and Duke University have benefited financially. Additional conflicts are disclosed in the study.
About this ASD and machine learning research news
Source: Duke University
Contact: Sarah Avery – Duke University
Original Research: Closed access.
“Computational Methods to Measure Patterns of Gaze in Toddlers With Autism Spectrum Disorder” by Geraldine Dawson et al. JAMA Pediatrics
Computational Methods to Measure Patterns of Gaze in Toddlers With Autism Spectrum Disorder
Importance
Atypical eye gaze is an early-emerging symptom of autism spectrum disorder (ASD) and holds promise for autism screening. Current eye-tracking methods are expensive and require special equipment and calibration. There is a need for scalable, feasible methods for measuring eye gaze.
Objective
Using computational methods based on computer vision analysis, we evaluated whether an app deployed on an iPhone or iPad that displayed strategically designed brief movies could elicit and quantify differences in eye-gaze patterns of toddlers with ASD vs typical development.
Design, Setting, and Participants
A prospective study comparing toddlers with and without ASD was conducted in pediatric primary care clinics from December 2018 to March 2020. Caregivers of 1564 toddlers were invited to participate during a well-child visit; 993 toddlers (63%) completed the study measures. Enrollment criteria were age 16 to 38 months, general good health, an English- or Spanish-speaking caregiver, and the toddler’s ability to sit and view the app. Participants were screened with the Modified Checklist for Autism in Toddlers–Revised With Follow-up during routine care. Children were referred by their pediatrician for diagnostic evaluation based on results of the checklist or if the caregiver or pediatrician was concerned. Forty toddlers were subsequently diagnosed with ASD.
Exposures
A mobile app displayed on a smartphone or tablet.
Main Outcomes and Measures
Computer vision analysis quantified eye-gaze patterns elicited by the app, which were compared between toddlers with ASD vs typical development.
Results
Mean age of the sample was 21.1 months (range, 17.1-36.9 months), and 50.6% were boys, 59.8% White individuals, 16.5% Black individuals, 23.7% other race, and 16.9% Hispanic/Latino individuals. Distinctive eye-gaze patterns were detected in toddlers with ASD, characterized by reduced gaze to social stimuli and to salient social moments during the movies, and previously unknown deficits in coordination of gaze with speech sounds. The area under the receiver operating characteristic curve discriminating ASD vs non-ASD using multiple gaze features was 0.90 (95% CI, 0.82-0.97).
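The reported area under the ROC curve of 0.90 is the probability that a randomly chosen toddler with ASD receives a higher risk score from the gaze features than a randomly chosen toddler without ASD. As a hedged illustration with made-up scores (not the study's data), that probability can be computed directly via the Mann-Whitney formulation:

```python
# Illustrative AUC computation; the scores below are invented, not from
# the study. AUC = P(score of a positive case > score of a negative case),
# with ties counted as one half.
import numpy as np

def auc_from_scores(scores_pos, scores_neg):
    """Area under the ROC curve via pairwise comparisons (Mann-Whitney)."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)

# Toy scores: higher = more object-focused (ASD-like) gaze.
auc = auc_from_scores([0.9, 0.8, 0.7], [0.4, 0.6, 0.75])
```

For large samples a rank-based implementation (or a library routine) is preferable to this O(n·m) pairwise version, which is shown here only for clarity.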
Conclusions and Relevance
The app reliably measured both known and new gaze biomarkers that distinguished toddlers with ASD vs typical development. These novel results may have potential for developing scalable autism screening tools, exportable to natural settings, and enabling data sets amenable to machine learning.