A neuron and its dendrites are shown here.
How do neurons process information? Neurons are known to break down an incoming electrical signal into sub-units. Now, researchers at Blue Brain have discovered that dendrites, the neuron’s tree-like receptors, work together – dynamically and depending on the workload – for learning. The findings further our understanding of how we think and may inspire new algorithms for artificial intelligence. Image credit: EPFL.

The way a single neuron processes information is never the same

Summary: When a neuron receives information, dendrites functionally work together to adjust for the complexity of the input.

Source: EPFL

In a paper published in the journal Cell Reports, researchers at EPFL’s Blue Brain Project, a Swiss Brain Research Initiative, have developed a new framework to work out how a single neuron in the brain operates.

The analysis was performed using cells from the Blue Brain’s virtual rodent cortex. The researchers expect other types of neurons – non-cortical or human – to operate in the same way.

Their results show that when a neuron receives input, the branches of the elaborate tree-like receptors extending from it – the dendrites – functionally work together in a way that is adjusted to the complexity of the input.

The strength of a synapse determines how strongly a neuron feels an electric signal coming from other neurons, and the act of learning changes this strength. By analyzing the “connectivity matrix” that determines how these synapses communicate with each other, the algorithm establishes when and where synapses group into independent learning units, based on the structural and electrical properties of the dendrites. In other words, the new algorithm determines how the dendrites of a neuron functionally break up into separate computing units, and it finds that these units work together dynamically, depending on the workload, to process information.
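To make this concrete, here is a minimal Python sketch of how synapse sites could be grouped into independent units from their pairwise electrical coupling. It is an illustration only: the impedance values, the normalization, and the clustering threshold are assumptions, not the authors’ published method.

```python
# Hypothetical sketch (not the authors' published algorithm): group
# synapse sites into electrically independent subunits by thresholding
# a transfer-impedance matrix. The normalization and the 0.2 threshold
# are illustrative assumptions.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def independent_subunits(Z, coupling_threshold=0.2):
    """Cluster synapse locations into electrically independent subunits.

    Z: (n, n) transfer-impedance matrix; Z[i, j] is large when site j
    strongly feels an input injected at site i.
    """
    # Normalize each pair by the sites' own input impedances (diagonal),
    # so values near 1 mean tight electrical coupling.
    coupling = np.abs(Z) / np.sqrt(np.outer(np.diag(Z), np.diag(Z)))
    # Sites whose mutual coupling exceeds the threshold are merged into
    # the same functional unit; weaker links count as independent.
    adjacency = csr_matrix(coupling > coupling_threshold)
    n_units, labels = connected_components(adjacency, directed=False)
    return n_units, labels

# Toy example: two tightly coupled sites plus one electrically remote site.
Z = np.array([[1.0, 0.8, 0.05],
              [0.8, 1.0, 0.04],
              [0.05, 0.04, 1.0]])
print(independent_subunits(Z))  # -> 2 units, labels [0, 0, 1]
```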

The researchers liken their results to computing technology already in use today. The newly observed dendritic functionality acts like a set of parallel computing units, meaning that a neuron can process different aspects of its input in parallel, as supercomputers do. Each of the parallel computing units can independently learn to adjust its output, much like the nodes in the deep learning networks used in artificial intelligence (AI) models today. And, comparable to cloud computing, a neuron dynamically breaks up into the number of separate computing units demanded by the workload of the input.
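The analogy can be pictured with a toy model, sketched below, in which each dendritic subunit acts as a small nonlinear node and the soma sums the parallel outputs. Everything in it – the sigmoid nonlinearity, the random synapse-to-subunit assignment, the weights – is an assumption made for illustration.

```python
# Toy "neuron as parallel subunits" model: each subunit passes its own
# synapses through a nonlinearity and the soma sums the results, much
# like one hidden layer of a deep network. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_synapses, n_subunits = 12, 3

# Each synapse belongs to one subunit (one "parallel processor").
assignment = rng.integers(0, n_subunits, size=n_synapses)
weights = rng.normal(0.0, 0.5, size=n_synapses)  # synaptic strengths

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neuron_output(inputs):
    # Each subunit independently transforms its own share of the input...
    subunit_out = [sigmoid(weights[assignment == k] @ inputs[assignment == k])
                   for k in range(n_subunits)]
    # ...and the soma combines the parallel results.
    return sum(subunit_out)

print(neuron_output(rng.random(n_synapses)))
```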

“In the Blue Brain Project, this mathematical approach helps to ascertain functionally relevant clusters of neuronal input – that is, inputs that feed into the same parallel processing unit. This then enables us to determine the level of complexity at which to model cortical networks as we digitally reconstruct and simulate the brain,” explains Marc-Oliver Gewaltig, Section Manager in Blue Brain’s Simulation Neuroscience Division.

Parallel computing units of neurons can independently learn to adjust their output

Additionally, the research reveals how these parallel processing units influence learning, i.e., the change in connection strength between different neurons. The way a neuron learns depends on the number and location of its parallel processors, which in turn depend on the signals arriving from other neurons. For instance, certain synapses that do not learn independently when the neuron’s input level is low start to learn independently when input levels are higher.
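A hypothetical sketch of this effect: with a rule that confines weight updates to each processing unit, the same synapses learn jointly under one compartmentalization and independently under another. The Hebbian-style rule, the unit labels, and the learning rate are illustrative assumptions.

```python
# Hypothetical sketch of compartment-dependent learning: the same
# synapses update jointly when they share one processing unit, but
# independently once they split into two.
import numpy as np

rng = np.random.default_rng(1)
n_synapses, lr = 8, 0.1
weights = rng.normal(0.0, 0.5, n_synapses)

def local_update(weights, inputs, labels):
    """labels[i] names the parallel processing unit of synapse i."""
    new_w = weights.copy()
    for unit in np.unique(labels):
        mask = labels == unit
        # Each unit computes its own output and updates only its own
        # synapses; other units' synapses are untouched by this step.
        local_out = np.tanh(weights[mask] @ inputs[mask])
        new_w[mask] += lr * local_out * inputs[mask]
    return new_w

x = rng.random(n_synapses)
one_unit = np.zeros(n_synapses, dtype=int)      # low input: one unit
two_units = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # high input: two units
print(local_update(weights, x, one_unit))
print(local_update(weights, x, two_units))
```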

To date, traditional learning algorithms (such as those currently used in AI applications) have assumed that neurons are static units that merely integrate and re-scale incoming signals. By contrast, these results show that the number and size of the independent subunits can be controlled by balanced input or by shunting inhibition. The researchers propose that this temporary control of compartmentalization constitutes a powerful mechanism for the branch-specific learning of input features.
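The shunting effect can be pictured with a simple passive circuit, sketched below: adding a shunting conductance between two dendritic sites lowers the transfer impedance between them, so the sites decouple into separate subunits. The circuit and all parameter values are assumptions chosen only to show the qualitative behavior.

```python
# Rough illustration of shunting inhibition splitting a dendrite: three
# passive nodes in a chain (site A - middle - site B), with a shunting
# conductance at the middle node. All parameter values are arbitrary.
import numpy as np

def transfer_impedance(g_shunt, g_axial=1.0, g_leak=0.1):
    # Conductance matrix of the three-node chain; inverting it gives
    # the impedance matrix Z, with V = Z @ I at steady state.
    G = np.array([
        [g_leak + g_axial, -g_axial,                        0.0],
        [-g_axial,          g_leak + 2 * g_axial + g_shunt, -g_axial],
        [0.0,              -g_axial,          g_leak + g_axial],
    ])
    Z = np.linalg.inv(G)
    return Z[0, 2]  # how strongly site B feels an input at site A

for g in (0.0, 1.0, 10.0):
    print(f"shunt={g:5.1f}  Z_AB={transfer_impedance(g):.3f}")
# Output falls from ~2.93 to ~0.08: the stronger the shunt, the more
# the two end sites behave as separate, independent subunits.
```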

“The method finds that in many brain states, neurons have far fewer parallel processors than expected from dendritic branch patterns. Thus, many synapses appear to be in ‘grey zones’ where they do not belong to any processing unit,” explains lead scientist and first author Willem Wybo. “However, in the brain, neurons receive varying levels of background input, and our results show that the number of parallel processors varies with the level of background input, indicating that the same neuron might have different computational roles in different brain states.”

“We are particularly excited about this observation since it sheds new light on the role of up/down states in the brain, and it also provides a reason as to why cortical inhibition is so location-specific. With the new insights, we can start looking for algorithms that exploit the rapid changes in pairing between processing units, offering us more insight into the fundamental question of how the brain computes,” concludes Gewaltig.

Funding: Funding was received from the ETH Domain for the Blue Brain Project (BBP) and the European Union Seventh Framework Program (FP7/2007-2013) under grant agreements no. FP7-26992115 (BrainScaleS) and FP7-604102 (The Human Brain Project), as well as under EU grant agreement no. 720270 (HBP SGA1).

About this neuroscience research article

Source:
EPFL
Media Contacts:
Hillary Sanctuary – EPFL
Image Source:
The image is credited to EPFL.

Original Research: Open access
“Electrical Compartmentalization in Neurons” by Willem A.M. Wybo, Benjamin Torben-Nielsen, Thomas Nevian, and Marc-Oliver Gewaltig.
Cell Reports. doi:10.1016/j.celrep.2019.01.074

Abstract

Electrical Compartmentalization in Neurons

Highlights
• Neural computation relies on compartmentalized dendrites to discern inputs
• A method is described to systematically derive the degree of compartmentalization
• There are substantially fewer functional compartments than dendritic branches
• Compartmentalization is dynamic and can be tuned by synaptic inputs

Summary
The dendritic tree of neurons plays an important role in information processing in the brain. While it is thought that dendrites require independent subunits to perform most of their computations, it is still not understood how they compartmentalize into functional subunits. Here, we show how these subunits can be deduced from the properties of dendrites. We devised a formalism that links the dendritic arborization to an impedance-based tree graph and show how the topology of this graph reveals independent subunits. This analysis reveals that cooperativity between synapses decreases slowly with increasing electrical separation and thus that few independent subunits coexist. We nevertheless find that balanced inputs or shunting inhibition can modify this topology and increase the number and size of the subunits in a context-dependent manner. We also find that this dynamic recompartmentalization can enable branch-specific learning of stimulus features. Analysis of dendritic patch-clamp recording experiments confirmed our theoretical predictions.
