As Baby Boomers age, many experience difficulty hearing and understanding conversations in noisy environments such as restaurants. People who are hearing-impaired and wear hearing aids or cochlear implants are even more severely affected. Researchers know that the ability to locate the source of a sound with ease is vital to hearing well in these situations, but much more information is needed about how hearing works before devices can be designed that perform better in noisy environments.
Researchers from the Eaton-Peabody Laboratories of the Massachusetts Eye and Ear, Harvard Medical School, and the Research Laboratory of Electronics at the Massachusetts Institute of Technology have gained new insight into how sound localization works in the brain. Their research is published in the Oct. 2, 2013 issue of the Journal of Neuroscience.
“Most people are able to locate the source of a sound with ease, for example, a snapping twig on the left, or a honking horn on the right. However, this is actually a difficult problem for the brain to solve,” said Mitchell L. Day, Ph.D., investigator in the Eaton-Peabody Laboratories at Mass. Eye and Ear and instructor of Otology and Laryngology at Harvard Medical School. “The higher levels of the brain that decide the direction a sound is coming from do not have access to the actual sound, but only to the representation of that sound in the electrical activity of neurons at lower levels in the brain. How higher levels of the brain use the information contained in the electrical activity of these lower-level neurons to create the perception of sound location is not known.”
In the experiment, researchers recorded the electrical activity of individual neurons in an essential lower-level auditory brain area called the inferior colliculus (IC) while an animal listened to sounds coming from different directions. They found that the location of a sound source could be accurately predicted from the pattern of activation across a population of fewer than 100 IC neurons – i.e., a particular pattern of IC activation indicated a particular location in space. Researchers further found that the pattern of IC activation could correctly distinguish whether there was a single sound source present or two sources coming from different directions – i.e., the pattern of IC activation could segregate concurrent sources.
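The idea of predicting a sound's location from a population activation pattern can be illustrated with a toy decoder. The sketch below is not the authors' actual analysis; it simulates a hypothetical population of 100 direction-tuned neurons (Gaussian tuning curves, assumed widths and preferred directions are invented for illustration) and decodes a source azimuth by matching an observed firing-rate pattern against stored templates, one per candidate direction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 100 simulated IC-like neurons, each with a preferred
# azimuth; firing rate falls off with distance from that preferred direction.
n_neurons = 100
azimuths = np.linspace(-90, 90, 13)            # candidate source directions (deg)
preferred = rng.uniform(-90, 90, n_neurons)    # each neuron's preferred direction

def population_rates(azimuth, noise=0.0):
    """Mean firing rate of each neuron for a source at `azimuth` (degrees)."""
    tuning = np.exp(-0.5 * ((azimuth - preferred) / 30.0) ** 2)  # Gaussian tuning
    return tuning + noise * rng.standard_normal(n_neurons)

# Templates: the expected population pattern for each candidate direction.
templates = np.stack([population_rates(a) for a in azimuths])

def decode(pattern):
    """Return the candidate azimuth whose template best matches `pattern`."""
    errors = np.sum((templates - pattern) ** 2, axis=1)
    return azimuths[np.argmin(errors)]

# A noisy observation from a source at 30 deg should decode to a nearby azimuth.
observed = population_rates(30.0, noise=0.05)
print(decode(observed))
```

With a pattern this distinctive, even a simple template match recovers the source direction, which mirrors the paper's point that a relatively small population carries enough information to localize a source.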
“Our results show that higher levels of the brain may be able to accurately segregate and localize sound sources based on the detection of patterns in a relatively small population of IC neurons,” said Dr. Day. “We hope to learn more so that someday we can design devices that work better in noisy environments.”
Notes about this neuroscience research
This work was funded by National Institute on Deafness and Other Communication Disorders grants R01 DC002258 and P30 DC005209. The paper was co-authored by Mitchell L. Day and Bertrand Delgutte.
Contact: Mary Leach – Massachusetts Eye and Ear Infirmary
Source: Massachusetts Eye and Ear Infirmary
Image Source: The inferior colliculus image is credited to Gray's Anatomy. The image is in the public domain.
Original Research: The research paper will be published in the Journal of Neuroscience on October 2, 2013. We will provide a link to the research when available.