Summary: A new study challenges long-held beliefs about how learning occurs. Researchers suggest that learning takes place in dendrites, which lie in close proximity to the neuron, rather than solely in synapses.
Source: Bar-Ilan University.
The brain is a complex network containing billions of neurons, each of which communicates simultaneously with thousands of others via its synapses (links). However, each neuron actually collects its many incoming synaptic signals through only a few extremely long, branched “arms” called dendritic trees.
In 1949, Donald Hebb’s pioneering work suggested that learning occurs in the brain by modifying the strength of the synapses, while neurons function as the computational elements. This has remained the common assumption to this day.
Using new theoretical results and experiments on neuronal cultures, a group of scientists led by Prof. Ido Kanter, of the Department of Physics and the Gonda (Goldschmied) Multidisciplinary Brain Research Center at Bar-Ilan University, has demonstrated that this nearly 70-year-old central assumption, that learning occurs only in the synapses, is mistaken.
In an article published today in the journal Scientific Reports, the researchers go against conventional wisdom to show that learning is actually performed by a few dendrites, in a manner similar to the slow learning mechanism currently attributed to the synapses.
“The newly discovered process of learning in the dendrites occurs at a much faster rate than in the old scenario suggesting that learning occurs solely in the synapses. In this new dendritic learning process, there are a few adaptive parameters per neuron, in comparison to thousands of tiny and sensitive ones in the synaptic learning scenario,” said Prof. Kanter, whose research team includes Shira Sardi, Roni Vardi, Anton Sheinin, Amir Goldental and Herut Uzan.
The newly suggested learning scenario indicates that learning occurs in a few dendrites that lie in much closer proximity to the neuron, in contrast to the previous notion. “Does it make sense to measure the quality of air we breathe via many tiny, distant satellite sensors at the elevation of a skyscraper, or by using one or several sensors in close proximity to the nose? Similarly, it is more efficient for the neuron to estimate its incoming signals close to its computational unit, the neuron,” says Kanter. Hebb’s theory has been so deeply rooted in the scientific world for 70 years that no one has ever proposed such a different approach. Moreover, synapses and dendrites are connected to the neuron in series, so the exact localized site of the learning process seemed irrelevant.
Another important finding of the study is that weak synapses, which comprise the majority of synapses in our brain and were previously assumed to be insignificant, play an important role in its dynamics. They induce oscillations of the learning parameters rather than pushing them to unrealistic fixed extremes, as the current synaptic learning scenario suggests.
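To make the contrast concrete, here is a minimal Python toy showing how an unconstrained per-link Hebbian update pushes a weight toward an extreme, while a self-controlled update stays bounded. This is purely illustrative and is not the study's model; the particular self-limiting term is an assumption chosen for simplicity.

```python
# Toy comparison: unconstrained Hebbian link learning vs. a self-controlled
# update. Illustrative only; not the mechanism proposed in the paper.
lr = 0.5      # learning rate
x = 1.0       # constant input
w_hebb = 0.1  # weight under pure Hebbian updates
w_ctrl = 0.1  # weight under a generic self-limiting update

for _ in range(50):
    y = w_hebb * x
    w_hebb += lr * x * y                       # grows without bound
    y2 = w_ctrl * x
    w_ctrl += lr * x * y2 * (1 - w_ctrl ** 2)  # saturates near a fixed point

print(f"unconstrained weight: {w_hebb:.2e}")   # an unrealistic extreme
print(f"self-controlled weight: {w_ctrl:.4f}") # settles near 1.0
```

The first rule multiplies the weight by a constant factor each step and diverges; the second includes a damping factor that vanishes as the weight approaches its fixed point, loosely mirroring the idea of a self-controlled mechanism that prevents divergence of the learning parameters.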
The new learning scenario occurs in different sites of the brain and therefore calls for a reevaluation of current treatments for disordered brain functionality. Hence, the popular phrase “neurons that fire together wire together”, which summarizes Donald Hebb’s 70-year-old hypothesis, must now be rephrased. In addition, this learning mechanism underlies recent advances in machine learning and deep learning. The change in the learning paradigm opens new horizons for different types of deep learning algorithms and artificial intelligence applications that imitate brain function, but with advanced features and at much faster speeds.
Funding: The study was supported by Israel Council for Higher Education.
Source: Elana Oberlander – Bar-Ilan University
Publisher: Organized by NeuroscienceNews.com.
Image Source: NeuroscienceNews.com image is credited to Prof. Ido Kanter.
Original Research: Open access research in Scientific Reports.
Adaptive nodes enrich nonlinear cooperative learning beyond traditional adaptation by links
Physical models typically assume time-independent interactions, whereas neural networks and machine learning incorporate interactions that function as adjustable parameters. Here we demonstrate a new type of abundant cooperative nonlinear dynamics where learning is attributed solely to the network nodes, instead of the links, whose number is significantly larger. The nodal (neuronal) fast adaptation follows its relative anisotropic (dendritic) input timings, as indicated experimentally, similarly to the slow learning mechanism currently attributed to the links (synapses). It represents a non-local learning rule, in which many incoming links to a node effectively undergo the same adaptation concurrently. The network dynamics are now counterintuitively governed by the weak links, which were previously assumed to be insignificant. This cooperative nonlinear dynamic adaptation presents a self-controlled mechanism that prevents divergence or vanishing of the learning parameters, as opposed to learning by links, and also supports self-oscillations of the effective learning parameters. It hints at a hierarchical computational complexity of nodes, following their number of anisotropic inputs, and opens new horizons for advanced deep learning algorithms and artificial-intelligence-based applications, as well as a new mechanism for enhanced and fast learning by neural networks.
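As a rough sketch of the parameter-count contrast between link (synaptic) and node (dendritic) learning described in the abstract, consider a single linear neuron with a plain error-correction update. Both the linear model and the update rule are illustrative assumptions, not the paper's method; the point is only that the node-based rule adapts one shared parameter, so all incoming links effectively undergo the same adaptation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 1000                 # incoming links per neuron (illustrative)
x = rng.normal(size=n_inputs)   # one incoming signal pattern
target = 1.0
lr = 0.001

# --- Link (synaptic) learning: one adjustable parameter per incoming link ---
w = rng.normal(scale=0.1, size=n_inputs)  # thousands of tiny parameters
y_link = w @ x
w += lr * (target - y_link) * x           # every weight updated independently

# --- Node (dendritic) learning: a few adaptive parameters per neuron ---
# Here a single gain per neuron; all incoming links share the same
# adaptation, making the rule non-local in the sense of the abstract.
g = 0.1
s = np.sum(x)
y_node = g * s
g += lr * (target - y_node) * s           # one shared parameter updated

print("link-learning parameters:", w.size)  # 1000
print("node-learning parameters:", 1)
```

In the link scenario the number of adjustable parameters scales with the number of synapses, while in the node scenario it scales with the number of neurons (or their few dendrites), which is the efficiency argument made in the article.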