Neurotechnology Research

A new study reveals that most people fail to recognize racial bias embedded in AI systems, even when it is visible in the training data. The research shows that artificial intelligence trained on imbalanced datasets—such as happy white faces and sad Black faces—learns to associate race with emotion, perpetuating biased performance.
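As a minimal, hedged sketch of the mechanism described above (our own toy construction with synthetic data, not the study's materials): when a demographic attribute is confounded with the emotion label in training data, a classifier leans on that attribute as a shortcut.

```python
# Hypothetical sketch (not the study's code): a label imbalance that is
# confounded with a demographic attribute teaches a model to use that
# attribute as a shortcut for predicting emotion.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

# Feature 0: a stand-in "demographic" attribute (0 or 1).
# Feature 1: a weakly informative "facial expression" feature.
group = rng.integers(0, 2, size=n)
expression = rng.normal(size=n)

# Confounded labels: in this toy dataset, group 0 is mostly labeled "happy"
# and group 1 mostly "sad", regardless of expression.
label = np.where(rng.random(n) < 0.9, group, 1 - group)

X = np.column_stack([group, expression])
clf = LogisticRegression().fit(X, label)

# The learned weight on the demographic feature dominates: the model has
# absorbed the dataset's imbalance rather than the expression itself.
print("weight on demographic feature:", clf.coef_[0][0])
print("weight on expression feature: ", clf.coef_[0][1])
```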
New research reveals that astrocytes — not neurons — are responsible for stabilizing emotional memories by re-engaging during recall. After an emotionally intense event, such as fear, specific astrocytes become biologically tagged with adrenoreceptors, making them responsive when the memory is later reactivated.
New research shows that our own physical movements can alter how we perceive emotions on others’ faces. In a virtual reality experiment, participants were more likely to judge a face as angry when they actively moved away from it, compared to when the face moved away from them.
A new study reveals that serotonin levels in the brain, measured using a simple EEG-based test, can predict who will experience sexual side effects from SSRI antidepressants. Researchers found that people with higher serotonin activity before treatment were significantly more likely to have difficulty reaching orgasm during antidepressant use.
Researchers have identified a rare type of brain cell that may drive the chronic inflammation and neurodegeneration seen in progressive multiple sclerosis (MS). These cells, called disease-associated radial glia-like (DARG) cells, appear six times more often in patients with progressive MS than in healthy individuals.

Brain-computer interface news covers science using BCIs, neural interfaces, brain implant technologies, EEG control of robotics, neurobotics, and more.

Researchers have created a noninvasive brain-computer interface enhanced with artificial intelligence, enabling users to control a robotic arm or cursor with greater accuracy and speed. The system translates brain signals from EEG recordings into movement commands, while an AI camera interprets user intent in real time.
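The summary describes a pipeline rather than publishing code, so the following is only a rough sketch of how such a shared-control loop could be structured. The sampling rate, channel count, decoder weights, and blend parameter are all assumptions for illustration, not details from the study.

```python
# Minimal sketch of a shared-control loop (our assumption of the architecture,
# not the authors' code): an EEG decoder proposes a cursor velocity, and a
# vision-based estimate of user intent nudges it toward the inferred target.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # EEG sampling rate in Hz (assumed)

def band_power(eeg_window, low=8.0, high=30.0):
    """Mean mu/beta band power per channel for one EEG window (channels x samples)."""
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, eeg_window, axis=1)
    return (filtered ** 2).mean(axis=1)

def decode_velocity(eeg_window, weights):
    """Linear decoder: band-power features -> 2D cursor velocity."""
    return weights @ band_power(eeg_window)

def shared_control(eeg_velocity, camera_target, cursor_pos, blend=0.5):
    """Blend the EEG-decoded velocity with a velocity toward the camera-inferred target."""
    to_target = camera_target - cursor_pos
    norm = np.linalg.norm(to_target)
    assist = to_target / norm if norm > 1e-6 else np.zeros(2)
    return (1 - blend) * eeg_velocity + blend * assist

# Toy usage with random stand-ins for real signals and a pre-trained decoder.
rng = np.random.default_rng(1)
weights = rng.normal(size=(2, 8))          # 8 EEG channels -> 2D velocity
eeg = rng.normal(size=(8, FS))             # one second of EEG
cursor = np.array([0.0, 0.0])
target = np.array([0.4, -0.2])             # target inferred by the AI camera (assumed)
velocity = shared_control(decode_velocity(eeg, weights), target, cursor)
print("commanded velocity:", velocity)
```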
Scientists have, for the first time, decoded inner speech—silent thoughts of words—on command using brain-computer interface technology, achieving up to 74% accuracy. By recording neural activity from participants with severe paralysis, the team found that inner speech and attempted speech share overlapping brain activity patterns, though inner speech signals are weaker.
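To make the "shared but weaker representation" finding concrete, here is a small synthetic illustration, not the study's data or decoder: a classifier trained on stronger attempted-speech features still transfers to weaker inner-speech features, though at reduced accuracy. The word count, feature count, and signal gains are invented.

```python
# Hedged illustration with synthetic data (not the study's recordings) of the
# idea that inner speech and attempted speech share a neural representation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_words, n_trials, n_features = 5, 80, 50
word_patterns = rng.normal(size=(n_words, n_features))  # shared pattern per word

def simulate(gain):
    """Trials whose class-dependent pattern is scaled by `gain`, plus noise."""
    labels = np.repeat(np.arange(n_words), n_trials)
    X = gain * word_patterns[labels] + rng.normal(size=(labels.size, n_features))
    return X, labels

X_attempted, y_attempted = simulate(gain=1.0)   # stronger attempted-speech signal
X_inner, y_inner = simulate(gain=0.4)           # weaker inner-speech signal

# Train on attempted speech, then test on inner speech: accuracy drops but
# stays well above the 20% chance level, reflecting the shared representation.
clf = LinearDiscriminantAnalysis().fit(X_attempted, y_attempted)
print("attempted-speech accuracy:", clf.score(X_attempted, y_attempted))
print("inner-speech accuracy:    ", clf.score(X_inner, y_inner))
```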
Researchers have developed a brain-computer interface that can synthesize natural-sounding speech from brain activity in near real time, restoring a voice to people with severe paralysis. The system decodes signals from the motor cortex and uses AI to transform them into audible speech with minimal delay—less than one second.
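As a sketch of the streaming idea only, the toy loop below decodes short chunks of fake neural data one at a time instead of waiting for a full sentence; decode_chunk is a placeholder for the trained model, and the chunk length, channel count, and audio rate are assumptions rather than the published system's parameters.

```python
# Structural sketch of a streaming neural-to-speech pipeline (placeholder
# decoder, not the published model): decode each chunk as it arrives so that
# per-chunk latency stays well under one second.
import time
import numpy as np

CHUNK_SECONDS = 0.08          # assumed decoding window
N_CHANNELS = 256              # assumed electrode count
AUDIO_RATE = 16000            # assumed output sample rate

def decode_chunk(neural_chunk):
    """Placeholder for the neural-to-audio model; returns a chunk of waveform samples."""
    # A real system would run a trained network here; we just emit silence.
    return np.zeros(int(CHUNK_SECONDS * AUDIO_RATE), dtype=np.float32)

def streaming_synthesis(chunks):
    """Decode chunks one at a time and check per-chunk latency."""
    audio = []
    for chunk in chunks:
        start = time.perf_counter()
        audio.append(decode_chunk(chunk))
        latency = time.perf_counter() - start
        assert latency < 1.0, "each chunk should be decoded well under one second"
    return np.concatenate(audio)

# Toy usage: two seconds of fake neural data split into 80 ms chunks.
rng = np.random.default_rng(0)
fake_stream = [rng.normal(size=(N_CHANNELS, 20)) for _ in range(25)]
waveform = streaming_synthesis(fake_stream)
print("synthesized", waveform.size / AUDIO_RATE, "seconds of audio")
```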
Researchers enabled a participant to produce speech using thought alone, without speaking aloud. Depth electrodes in the participant's brain transmitted electrical signals to a computer, which then vocalized imagined syllables. This technology offers hope for paralyzed individuals to regain speech, and the study marks a significant step toward brain-computer interfaces for voluntary communication.
Researchers are trialing a novel brain-computer interface (BCI) with the potential to transform neurosurgical procedures and patient care. The Layer 7 Cortical Interface, boasting 1,024 electrodes for unparalleled brain activity mapping, promises new insights into neurological and psychiatric conditions.
Elon Musk announces that the first human has been successfully implanted with Neuralink's brain chip, named Telepathy, which aims to allow severely physically disabled individuals to control devices via thought. The FDA-approved trial focuses on the implant's potential for movement control, with the patient reportedly recovering well and showing promising initial results.
Researchers achieved a breakthrough in converting brain signals to audible speech with up to 100% accuracy. The team used brain implants and artificial intelligence to directly map brain activity to speech in patients with epilepsy.

The latest science news involving neural prosthetics, arm and leg prostheses, bionics, biomechanical engineering, BCIs, robotics, EEG control of prosthetics, visual aids, auditory aids for hearing, and more is here.

A next-generation neuroprosthetic hand that restores a sense of touch is moving into a pivotal home-use clinical trial. The “iSens” system uses implanted electrodes to read muscle intent and stimulate nerves, relaying fingertip sensations to the brain so the prosthesis feels embodied.
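Reading the description as a bidirectional loop, one cycle can be sketched roughly as follows. The thresholds, force range, and stimulation scaling below are invented for illustration and are not iSens parameters.

```python
# Conceptual sketch (our reading of the description, not the iSens firmware) of
# a bidirectional prosthesis loop: decode grip intent from muscle activity,
# drive the hand, then map fingertip pressure back into a stimulation level.
import numpy as np

def decode_grip_intent(emg_rms, threshold=0.2):
    """Map smoothed muscle activity (0..1) to a grip-closure command (0..1)."""
    return float(np.clip((emg_rms - threshold) / (1 - threshold), 0.0, 1.0))

def pressure_to_stimulation(pressure_newtons, max_pressure=10.0, max_amp_ma=2.0):
    """Scale fingertip pressure to a (hypothetical) nerve-stimulation amplitude in mA."""
    return max_amp_ma * min(pressure_newtons, max_pressure) / max_pressure

# One cycle of the loop with made-up sensor readings.
grip = decode_grip_intent(emg_rms=0.6)
pressure = grip * 7.5                       # pretend the sensed force tracks the grip
stim = pressure_to_stimulation(pressure)
print(f"grip command {grip:.2f} -> fingertip {pressure:.1f} N -> stimulation {stim:.2f} mA")
```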

More Neurotech News

Browse all of our neurotechnology articles over the years. Remember you can click on the tags or search for specific articles.

Researchers have created one of the most detailed maps of the mouse brain ever made, using artificial intelligence to reveal 1,300 distinct regions and subregions. The AI model, called CellTransformer, identified new brain areas that had never been charted before, providing an unprecedented view of brain organization.
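The article describes CellTransformer only at a high level. As a rough analogue of one way neighborhood-based region discovery can work, and not the model's actual architecture, the sketch below describes each cell by the cell-type makeup of its spatial neighborhood and clusters those profiles into candidate regions; all counts and data are synthetic.

```python
# Simplified stand-in for neighborhood-based brain-region discovery (not the
# CellTransformer architecture): cluster cells by the cell-type composition of
# their spatial neighborhoods to carve out candidate regions.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_cells, n_types = 3000, 12
coords = rng.random((n_cells, 2))                      # x, y positions of cells
cell_type = rng.integers(0, n_types, size=n_cells)     # per-cell type label

# Neighborhood composition: fraction of each cell type among the 30 nearest cells.
neighbors = NearestNeighbors(n_neighbors=30).fit(coords)
_, idx = neighbors.kneighbors(coords)
composition = np.zeros((n_cells, n_types))
for t in range(n_types):
    composition[:, t] = (cell_type[idx] == t).mean(axis=1)

# Cluster the neighborhood profiles into candidate "regions".
regions = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(composition)
print("cells assigned per region:", np.bincount(regions))
```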
Researchers have developed a groundbreaking ultrasound device that can stimulate several precise points in the brain at the same time, marking a leap forward in non-invasive neuromodulation. Unlike earlier single-spot approaches, this technology uses lower-intensity ultrasound pulses that reduce risks of overheating and uncontrolled brain excitation.
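One standard way a phased ultrasound array can aim at more than one spot is to superpose the per-focus element delays, as in the back-of-the-envelope sketch below. The element count, spacing, frequency, and target positions are illustrative assumptions, not the device's specifications.

```python
# Illustrative geometry for multi-focus phased-array ultrasound (not the
# device's parameters): compute per-element time delays for each focus and
# superpose the drive signals.
import numpy as np

SPEED_OF_SOUND = 1540.0      # m/s in soft tissue (approximate)
FREQ = 650e3                 # Hz, a typical transcranial ultrasound frequency (assumed)

# A 1D array of 64 elements spaced 1 mm apart along x, at depth z = 0.
elements = np.column_stack([np.arange(64) * 1e-3, np.zeros(64)])

def focus_delays(focus_xy):
    """Time delays (s) that make all elements' wavefronts arrive at the focus together."""
    distances = np.linalg.norm(elements - np.asarray(focus_xy), axis=1)
    return (distances.max() - distances) / SPEED_OF_SOUND

def multi_focus_drive(foci, t):
    """Superpose the per-focus drive signals for each element at times t."""
    signal = np.zeros((len(elements), t.size))
    for focus in foci:
        delays = focus_delays(focus)
        signal += np.sin(2 * np.pi * FREQ * (t[None, :] - delays[:, None]))
    return signal / len(foci)

# Two simultaneous targets 5-6 cm deep, offset laterally (positions in meters).
t = np.arange(0, 50e-6, 1 / (10 * FREQ))
drive = multi_focus_drive([(0.02, 0.05), (0.045, 0.06)], t)
print("drive signal shape (elements x samples):", drive.shape)
```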
A large-scale clinical trial tested biofeedback-based therapy for children with residual speech sound disorder, a condition where pronunciation errors persist past age eight. Biofeedback methods—like ultrasound imaging or acoustic displays—gave children visual cues to adjust their tongue movements, accelerating progress with difficult sounds such as “r.”