Using electrocorticography (ECoG) to record brain activity, researchers found that the meaning of what people imagine can be decoded from their brain wave patterns, even when the imagined content differs from what they are actually looking at.
Using ECoG and machine learning, researchers decoded spoken words and phrases in real time from the brain signals that control speech. The technology could eventually help people who have lost vocal control regain their voices.
When people listen to music, neural tracking in the frontal lobe lags behind that in the temporal lobe; during music recall, the order reverses, with frontal-lobe activity preceding temporal-lobe activity. The findings demonstrate bottom-up and top-down processes in the cerebral cortex during music listening and recall, providing important insight into how the human brain processes music.