Summary: A newly designed dry sensor that can measure brain activity may someday enable mind control of robotic systems.
Source: American Chemical Society
It sounds like something from science fiction: Don a specialized, electronic headband and control a robot using your mind. Now, research published in ACS Applied Nano Materials has taken a step toward making this a reality.
By designing a special, 3D-patterned structure that doesn’t rely on sticky conductive gels, the team has created “dry” sensors that can measure the brain’s electrical activity, even amidst hair and the bumps and curves of the head.
Physicians monitor electrical signals from the brain with electroencephalography (EEG), in which specialized electrodes are either implanted into or placed on the surface of the head. EEG helps diagnose neurological disorders, but it can also be incorporated into “brain-machine interfaces,” which use brain waves to control an external device, such as a prosthetic limb, robot or even a video game.
Most non-invasive versions involve the use of “wet” sensors, which are stuck onto the head with a gloopy gel that can irritate the scalp and sometimes trigger allergic reactions.
As an alternative, researchers have been developing “dry” sensors that don’t require gels, but thus far none have worked as well as the gold-standard wet variety.
Although nanomaterials like graphene could be a suitable option, their flat and typically flaky nature makes them incompatible with the uneven curves of the human head, particularly over long periods. So, Francesca Iacopi and colleagues wanted to create a 3D sensor based on polycrystalline graphene that could accurately monitor brain activity without any stickiness.
The team created several 3D graphene-coated structures with different shapes and patterns, each around 10 µm thick. Of the shapes tested, a hexagonal pattern worked the best on the curvy, hairy surface of the occipital region — the spot at the base of the head where the brain’s visual cortex is located.
The team incorporated eight of these sensors into an elastic headband, which held them against the back of the head. When combined with an augmented reality headset displaying visual cues, the electrodes could detect which cue was being viewed, then work with a computer to interpret the signals into commands that controlled the motion of a four-legged robot — completely hands-free.
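The decoding scheme described above is based on frequency-tagged visual cues: each on-screen cue flickers at its own rate, and the attended cue's frequency dominates the EEG recorded over the visual cortex. The following is a minimal sketch of that idea, not the authors' actual pipeline; the sample rate, cue frequencies, and command mapping are illustrative assumptions.

```python
# Hypothetical SSVEP-style decoding sketch: pick the candidate cue
# frequency with the strongest power in the EEG spectrum and map it
# to a robot command. All parameters here are assumed for illustration.
import numpy as np

FS = 256  # EEG sample rate in Hz (assumed)
CUES = {7.0: "forward", 9.0: "left", 11.0: "right", 13.0: "stop"}

def decode_command(eeg: np.ndarray) -> str:
    """Return the command whose cue frequency dominates the spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)
    # Power at the spectral bin nearest each candidate cue frequency
    powers = {f: spectrum[np.argmin(np.abs(freqs - f))] for f in CUES}
    return CUES[max(powers, key=powers.get)]

# Demo with synthetic data: a 9 Hz "left" cue buried in noise.
rng = np.random.default_rng(0)
t = np.arange(2 * FS) / FS  # two seconds of signal
eeg = np.sin(2 * np.pi * 9.0 * t) + 0.5 * rng.standard_normal(t.size)
print(decode_command(eeg))
```

A real brain-machine interface would add bandpass filtering, multi-channel combination, and a more robust classifier (e.g., canonical correlation analysis), but the spectral-peak comparison captures the core of the steady-state visually evoked potential paradigm.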
Though the new electrodes didn’t yet work quite as well as the wet sensors, the researchers say that this work represents a first step toward developing robust, easily implemented dry sensors to help expand the applications of brain-machine interfaces.
Funding: The authors acknowledge funding from the Defence Innovation Hub of the Australian Government and support from the Australian National Fabrication Facility of the University of Technology Sydney and the Research & Prototype Foundry at the University of Sydney Nano Institute.
Noninvasive Sensors for Brain–Machine Interfaces Based on Micropatterned Epitaxial Graphene
The availability of accurate and reliable dry sensors for electroencephalography (EEG) is vital to enable large-scale deployment of brain–machine interfaces (BMIs). However, dry sensors invariably show poorer performance compared to the gold standard Ag/AgCl wet sensors.
The loss of performance with dry sensors is even more evident when monitoring the signal from hairy and curved areas of the scalp, requiring the use of bulky and uncomfortable acicular sensors.
This work demonstrates three-dimensional micropatterned sensors based on a subnanometer-thick epitaxial graphene for detecting the EEG signal from the challenging occipital region of the scalp.
The occipital region, corresponding to the visual cortex of the brain, is key to the implementation of BMIs based on the common steady-state visually evoked potential paradigm.
The patterned epitaxial graphene sensors show efficient on-skin contact with low impedance and can achieve signal-to-noise ratios comparable to those of wet sensors.
Using these sensors, we have also demonstrated hands-free communication with a quadruped robot through brain activity.