
The Brain Stores 10x More Info Than Thought

Summary: Researchers developed a method to measure synaptic strength, precision of plasticity, and information storage in the brain. Using information theory, they found that synapses can store 10 times more information than previously believed.

The findings enhance understanding of learning, memory, and how these processes evolve or deteriorate. This breakthrough could propel research on neurodevelopmental and neurodegenerative disorders.

Key Facts:

  • Synaptic Plasticity: Study measures synaptic strength, plasticity, and information storage using information theory.
  • Increased Storage: Findings show synapses can store 10 times more information than previously thought.
  • Research Impact: This method can advance studies on learning, memory, and brain disorders like Alzheimer’s.

Source: Salk Institute

With each flip you make through a deck of vocabulary word flashcards, their definitions come more quickly, more easily. This process of learning and remembering new information strengthens important connections in your brain.

Recalling those new words and definitions more easily with practice is evidence that those neural connections, called synapses, can grow stronger or weaker over time—a feature known as synaptic plasticity.

Quantifying the dynamics of individual synapses can be a challenge for neuroscientists, but recent computational innovations from the Salk Institute may be changing that—and revealing new insights about the brain along the way.

To understand how the brain learns and retains information, scientists try to quantify how much stronger a synapse has gotten through learning, and how much stronger it can get.

Synaptic strength can be measured by looking at the physical characteristics of synapses, but it’s much more difficult to measure the precision of plasticity (whether synapses grow weaker or stronger by a consistent amount) and the amount of information a synapse can store.

Salk scientists have established a new method to explore synaptic strength, precision of plasticity, and amount of information storage. Quantifying these three synaptic features can improve scientific understanding of how humans learn and remember, as well as how those processes evolve over time or deteriorate with age or disease.

The findings were published in Neural Computation on April 23, 2024.

“We’re getting better at identifying exactly where and how individual neurons are connected to each other, but we still have a lot to learn about the dynamics of those connections,” says Professor Terrence Sejnowski, senior author of the study and holder of the Francis Crick Chair at Salk.

“We have now created a technique for studying the strength of synapses, the precision with which neurons modulate that strength, and the amount of information synapses are capable of storing—leading us to find that our brain can store 10 times more information than we previously thought.”

When a message travels through the brain, it hops from neuron to neuron, flowing from the end of one neuron into the outstretched tendrils, called dendrites, of another.

Each dendrite on a neuron is covered with tiny bulbous appendages, called dendritic spines, and at the end of each dendritic spine is the synapse—a tiny space where the two cells meet, and an electrochemical signal is transmitted. Different synapses are activated to send different messages.

Some messages activate pairs of synapses, which live near one another on the same dendrite. These synapse pairs are a fantastic research tool—if two synapses have identical activation histories, scientists can compare the strength of those synapses to draw conclusions about the precision of plasticity.

Since the same type and amount of information has passed through these two synapses, did they each change in strength by the same amount? If so, their precision of plasticity is high.
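In code, that comparison is simple to express. Below is a minimal sketch in Python, using made-up spine head volumes for a few hypothetical synapse pairs; the actual study measured real spine dimensions from electron microscopy reconstructions. The coefficient of variation across each pair is one straightforward way to express how closely two "identically driven" synapses match:

```python
import numpy as np

# Hypothetical spine head volumes (um^3) for synapse pairs that share
# an axon and a dendrite, and therefore a common activation history.
pairs = np.array([
    [0.021, 0.023],
    [0.110, 0.105],
    [0.056, 0.061],
    [0.300, 0.290],
])

# For each pair, the coefficient of variation (std / mean) measures how
# closely the two synapses match in strength. Low CVs across many pairs
# indicate high precision of plasticity.
means = pairs.mean(axis=1)
stds = pairs.std(axis=1, ddof=0)
cv = stds / means

for (a, b), c in zip(pairs, cv):
    print(f"pair ({a:.3f}, {b:.3f}) um^3 -> CV = {c:.3f}")

print(f"median CV across pairs: {np.median(cv):.3f}")
```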

The Salk team applied concepts from information theory to analyze synapse pairs from a rat hippocampus—a part of the brain involved in learning and memory—for strength, plasticity, and precision of plasticity.

Information theory is a sophisticated mathematical way of understanding information processing as an input traveling through a noisy channel and being reconstructed on the other end.

Crucially, unlike methods used in the past, information theory accounts for the noisiness of the brain’s many signals and cells, in addition to offering a discrete unit of information—a bit—to measure the amount of information stored at a synapse.
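To make the bit concrete: a synapse that can reliably occupy any one of N distinguishable strength states can store at most log2(N) bits, with the maximum reached when all states are used equally often. A quick check (the values of N here are arbitrary):

```python
import math

for n in (2, 4, 24):
    print(f"{n} distinguishable states -> at most {math.log2(n):.2f} bits")
# 2 states -> 1.00 bit; 4 -> 2.00 bits; 24 -> ~4.58 bits,
# which is where the study's upper bound comes from.
```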

“We divided up synapses by strength, of which there were 24 possible categories, then compared special synapse pairs to determine how precisely each synapse’s strength is modulated,” says Mohammad Samavat, first author of the study and a postdoctoral researcher in Sejnowski’s lab.

“We were excited to find that the pairs had very similar dendritic spine sizes and synaptic strengths, meaning the brain is highly precise when it makes synapses weaker or stronger over time.”

In addition to noting the similarities in synapse strength within these pairs, which translates to a high level of precision of plasticity, the team also measured the amount of information held in each of the 24 strength categories. Despite differences in the size of each dendritic spine, each of the 24 synaptic strength categories held a similar amount (between 4.1 and 4.6 bits) of information.
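A rough sketch of that entropy calculation, assuming hypothetical spine head volumes and simple equal-width binning into 24 categories (the paper's actual procedure derives nonoverlapping, distinguishable size bins from the pair analysis rather than fixing bin edges by hand):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical spine head volumes (um^3); real values come from 3D
# electron microscopy reconstructions of hippocampal tissue.
volumes = rng.lognormal(mean=-3.0, sigma=0.8, size=1000)

# Bin into 24 strength categories and estimate the probability of each.
counts, _ = np.histogram(volumes, bins=24)
p = counts / counts.sum()

# Shannon entropy in bits: H = -sum(p * log2(p)) over occupied bins.
p = p[p > 0]
entropy = -(p * np.log2(p)).sum()

print(f"estimated storage: {entropy:.2f} bits "
      f"(ceiling for 24 bins: {np.log2(24):.2f} bits)")
```

Note that log2(24) is roughly 4.58 bits, which is why 24 distinguishable strength categories bracket the reported 4.1 to 4.6 bits.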

Compared to older techniques, this new approach using information theory is (1) more thorough, accounting for 10 times more information storage in the brain than was previously assumed, and (2) scalable, meaning it can be applied to diverse and large datasets to gather information about other synapses.

“This technique is going to be a tremendous help for neuroscientists,” says Kristen Harris, a professor at the University of Texas at Austin and an author of the study.

“Having this detailed look into synaptic strength and plasticity could really propel research on learning and memory, and we can use it to explore these processes in all different parts of human brains, animal brains, young brains, and old brains.”

Sejnowski says future work by projects like the National Institutes of Health’s BRAIN Initiative, which established a human brain cell atlas in October 2023, will benefit from this new tool.

In addition to scientists who catalog brain cell types and behaviors, the technique is exciting for those studying when information storage goes awry—like in Alzheimer’s disease.

In years to come, researchers around the world could use this technique to make exciting discoveries about the human brain’s ability to learn new skills, remember day-to-day actions, and store information short- and long-term.

About this synaptic plasticity research news

Author: Terrence Sejnowski
Source: Salk Institute
Contact: Terrence Sejnowski – Salk Institute
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Synaptic Information Storage Capacity Measured With Information Theory” by Terrence Sejnowski et al. Neural Computation


Abstract

Synaptic Information Storage Capacity Measured With Information Theory

Variation in the strength of synapses can be quantified by measuring the anatomical properties of synapses. Quantifying precision of synaptic plasticity is fundamental to understanding information storage and retrieval in neural circuits.

Synapses from the same axon onto the same dendrite have a common history of coactivation, making them ideal candidates for determining the precision of synaptic plasticity based on the similarity of their physical dimensions.

Here, the precision and amount of information stored in synapse dimensions were quantified with Shannon information theory, expanding prior analysis that used signal detection theory (Bartol et al., 2015).

The two methods were compared using dendritic spine head volumes in the middle of the stratum radiatum of hippocampal area CA1 as well-defined measures of synaptic strength.

Information theory delineated the number of distinguishable synaptic strengths based on nonoverlapping bins of dendritic spine head volumes. Shannon entropy was applied to measure synaptic information storage capacity (SISC) and resulted in a lower bound of 4.1 bits and upper bound of 4.59 bits of information based on 24 distinguishable sizes.

We further compared the distribution of distinguishable sizes and a uniform distribution using Kullback-Leibler divergence and discovered that there was a nearly uniform distribution of spine head volumes across the sizes, suggesting optimal use of the distinguishable values.
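For readers who want to reproduce that uniformity comparison, here is a minimal sketch of the Kullback-Leibler divergence between an observed bin distribution and the uniform distribution, with a toy distribution standing in for the measured spine head volumes:

```python
import numpy as np

def kl_divergence(p, q):
    """KL divergence D(p || q) in bits, summed over bins where p > 0."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / q[mask])).sum())

# Toy observed distribution over 24 strength bins (sums to 1);
# the study found the real distribution to be nearly uniform.
rng = np.random.default_rng(1)
observed = rng.dirichlet(np.full(24, 50.0))  # close to uniform
uniform = np.full(24, 1 / 24)

print(f"D(observed || uniform) = {kl_divergence(observed, uniform):.4f} bits")
# A value near 0 means the distinguishable sizes are used nearly
# equally often -- i.e., near-optimal use of the available states.
```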

Thus, SISC provides a new analytical measure that can be generalized to probe synaptic strengths and capacity for plasticity in different brain regions of different species and among animals raised in different conditions or during learning. How brain diseases and disorders affect the precision of synaptic plasticity can also be probed.
