Summary: Researchers have developed a portable fNIRS system to measure a person’s brain activity as they navigate a location while wearing smart eyewear.
Source: Drexel University.
Researchers use functional near-infrared spectroscopy to measure mental workload as subjects navigate a college campus.
“Smart” eyewear — which can overlay augmented reality on your field of view, feed you live information about your surroundings and even be used in the operating room — is no longer the stuff of science fiction.
Wearable displays also have the potential to enhance cognitive ergonomics, or more simply, make it less mentally taxing to complete certain tasks. But before technologies like Google Glass become a part of daily life, engineers need a way to monitor exactly how they affect the brain in everyday situations.
At Drexel University, researchers have developed a portable system that can do just that. The system uses functional near-infrared spectroscopy, or fNIRS, to measure a person’s brain activity.
The applications for fNIRS are seemingly endless — from training air traffic controllers and drone operators, to studying how students with disabilities learn best, to understanding why different people are more receptive to certain Super Bowl commercials.
“This is a new trend called neuroergonomics. It’s the study of the brain at work — cognitive neuroscience plus human factors,” said Hasan Ayaz, PhD, associate research professor in the School of Biomedical Engineering, Science and Health Systems and a member of Drexel’s CONQUER Collaborative. The phrase was coined by the late Raja Parasuraman, a former professor at George Mason University and study co-author.
Until now, most studies involving fNIRS took place indoors. Though participants wearing the system could move around freely while being monitored, they were still observed within laboratory confines.
A group of Drexel biomedical engineers, in collaboration with researchers at George Mason University, has now brought its portable fNIRS system “into the wild.” In their study, published this summer in Frontiers in Human Neuroscience, the researchers successfully measured the brain activity of participants navigating a college campus outdoors.
The researchers wanted to compare one group of participants navigating campus with Google Glass to another group using Google Maps on an iPhone. Their goal was to measure mental workload (how hard the brain is working) and situation awareness (the perception of environmental elements), in order to see which device was less mentally taxing.
They found that, overall, participants using Google Glass had higher situation awareness and a lower mental workload than their peers navigating with an iPhone.
However, the researchers also found that users wearing Google Glass fell victim to “cognitive tunneling,” meaning they focused so much of their attention on the display itself that they tended to ignore other aspects of their surroundings.
“What we were able to see were the strengths and weaknesses of both. Now that we know we are able to capture that, we can now improve their design,” said Ayaz, the study’s principal investigator. “This opens up all new areas of applications. We will be able to analyze how the brain is functioning during all of these natural activities that you cannot replicate in artificial lab settings.”
fNIRS is a way to measure oxygenation levels in the prefrontal cortex — the part of the brain responsible for complex behaviors like decision making, cognitive expression and personality development. Greater activity in this area signals that a person is a novice at an activity and must therefore work harder at it. As someone masters a skill, information processing shifts toward the back regions of the brain.
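fNIRS infers these oxygenation changes optically. As a rough illustration only (not the Drexel team’s actual processing pipeline), the modified Beer-Lambert law converts changes in detected light intensity at two wavelengths into concentration changes of oxy- and deoxy-hemoglobin; all numeric parameters below are hypothetical placeholders:

```python
import numpy as np

# Illustrative sketch of the modified Beer-Lambert law used in fNIRS.
# Extinction coefficients, separation and pathlength factor are assumed
# placeholder values, not the parameters of any particular system.

# Extinction coefficients for [HbO2, HbR] at two wavelengths (1/(mM*cm))
EXT = np.array([[1.49, 3.84],   # ~730 nm
                [2.53, 1.80]])  # ~850 nm
SOURCE_DETECTOR_CM = 2.5        # source-detector separation on the headband
DPF = 6.0                       # differential pathlength factor (assumed)

def hemoglobin_changes(intensity, baseline):
    """Convert detected light intensities at two wavelengths into
    concentration changes of oxy- (HbO2) and deoxy-hemoglobin (HbR), in mM."""
    # Change in optical density at each wavelength
    delta_od = -np.log(np.asarray(intensity) / np.asarray(baseline))
    # Solve delta_od = EXT @ delta_c * (separation * DPF) for delta_c
    delta_c = np.linalg.solve(EXT, delta_od) / (SOURCE_DETECTOR_CM * DPF)
    return {"HbO2": delta_c[0], "HbR": delta_c[1]}

# A drop in detected intensity at both wavelengths implies increased absorption,
# here dominated by a rise in oxygenated hemoglobin:
print(hemoglobin_changes(intensity=[0.95, 0.93], baseline=[1.0, 1.0]))
```

In practice, fNIRS toolchains apply this conversion to continuous sensor streams after motion-artifact and baseline correction; the sketch shows only the core algebra.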
In the past, researchers had to use secondary tasks to measure the “user-friendliness” of an augmented reality product, like Google Glass. For instance, while a person was navigating with a maps application, they would be asked to recall a series of sounds played to them through headphones. If their responses were inaccurate, this implied that their brain had to work harder to pay attention to the primary task at hand.
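The dual-task logic described above can be sketched in a few lines. This is a hypothetical illustration of the general idea (the study’s actual behavioral analysis is more involved): lower recall accuracy on the audio probes is read as higher workload on the primary navigation task.

```python
# Hypothetical sketch of the secondary-task ("dual-task") workload measure:
# worse recall of probe sounds implies the primary task demanded more attention.

def recall_accuracy(responses, presented):
    """Fraction of probe sounds the participant recalled correctly."""
    correct = sum(r == p for r, p in zip(responses, presented))
    return correct / len(presented)

def inferred_workload(accuracy, baseline_accuracy=0.95):
    """Crude workload proxy: how far accuracy fell below a no-load baseline."""
    return max(0.0, baseline_accuracy - accuracy)

# Made-up example data for two participants
presented        = ["beep", "chirp", "beep", "tone", "chirp", "tone"]
glass_responses  = ["beep", "chirp", "beep", "tone", "chirp", "beep"]  # 5/6 correct
iphone_responses = ["beep", "tone",  "beep", "beep", "chirp", "beep"]  # 3/6 correct

print(inferred_workload(recall_accuracy(glass_responses, presented)))   # lower workload
print(inferred_workload(recall_accuracy(iphone_responses, presented)))  # higher workload
```

As the next paragraph notes, the Drexel team found such probe tasks intrusive; the fNIRS measurements made them unnecessary.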
For comparison, the Drexel researchers also used secondary tasks to measure mental workload and situation awareness. However, they found that these tasks were intrusive and ultimately unnecessary. The fNIRS system was able to accurately assess brain activity during the task and examine differences between a hand-held display and wearable display.
“We observed greater mental capacity reserves for head-mounted display users during ambulatory navigation based on behavioral and neuro-metabolic evidence. However, we also observed evidence that some of the advantages of head-mounted displays are overshadowed by their suboptimal display symbology, which can be overly attention grabbing,” said Ryan McKendrick, PhD, the study’s lead author and now a cognitive scientist at Northrop Grumman Corporation.
Since the research team found that Google Glass users experienced some cognitive tunneling while navigating, they suggest that future studies identify other brain biomarkers induced by this “blindness” to the outside world. By identifying cognitive tunneling biomarkers, engineers could “greatly advance display design for navigation, training and other tasks” that wearable displays are expected to enhance.
About this psychology research article
Funding: The study was funded with the support of the Air Force Office of Scientific Research, Center of Excellence in Neuroergonomics, Technology, and Cognition.
Source: Lauren Ingeno – Drexel University
Image Source: This NeuroscienceNews.com image is credited to Drexel University.
Original Research: Full open access research for “Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy” by Ryan McKendrick, Raja Parasuraman, Rabia Murtza, Alice Formwalt, Wendy Baccus, Martin Paczynski and Hasan Ayaz in Frontiers in Human Neuroscience. Published online May 18 2016 doi:10.3389/fnhum.2016.00216
Into the Wild: Neuroergonomic Differentiation of Hand-Held and Augmented Reality Wearable Displays during Outdoor Navigation with Functional Near Infrared Spectroscopy
Highly mobile computing devices promise to improve quality of life, productivity, and performance. Increased situation awareness and reduced mental workload are two potential means by which this can be accomplished. However, it is difficult to measure these concepts in the “wild”. We employed ultra-portable, battery-operated and wireless functional near infrared spectroscopy (fNIRS) to non-invasively measure hemodynamic changes in the brain’s prefrontal cortex (PFC). Measurements were taken during navigation of a college campus with either a hand-held display or an augmented reality wearable display (ARWD). Hemodynamic measures were also paired with secondary tasks of visual perception and auditory working memory to provide behavioral assessment of situation awareness and mental workload. Navigating with an augmented reality wearable display produced the least workload during the auditory working memory task, and a trend for improved situation awareness in our measures of prefrontal hemodynamics. The hemodynamics associated with errors were also different between the two devices. Errors with an augmented reality wearable display were associated with increased prefrontal activity and the opposite was observed for the hand-held display. This suggests that the cognitive mechanisms underlying errors between the two devices differ. These findings show fNIRS is a valuable tool for assessing new technology in ecologically valid settings and that ARWDs offer benefits with regards to mental workload while navigating, and potentially superior situation awareness with improved display design.