How the Brain Makes New Memories While Preserving the Old

Summary: Researchers have developed a new mathematical model that could help explain how the brain can lay down new memories without wiping out old ones.

Source: Zuckerman Institute.

New mathematical model helps resolve a long-standing scientific question and offers a framework to guide future studies of memory.

Columbia scientists have developed a new mathematical model that helps to explain how the human brain’s biological complexity allows it to lay down new memories without wiping out old ones — illustrating how the brain maintains the fidelity of memories for years, decades or even a lifetime. This model could help neuroscientists design more targeted studies of memory, and also spur advances in neuromorphic hardware — powerful computing systems inspired by the human brain.

This work is published online today in Nature Neuroscience.

“The brain is continually receiving, organizing and storing memories. These processes, which have been studied in countless experiments, are so complex that scientists have been developing mathematical models in order to fully understand them,” said Stefano Fusi, PhD, a principal investigator at Columbia’s Mortimer B. Zuckerman Mind Brain Behavior Institute, associate professor of neuroscience at Columbia University Medical Center and the paper’s senior author. “The model that we have developed finally explains why the biology and chemistry underlying memory are so complex — and how this complexity drives the brain’s ability to remember.”

Memories are widely believed to be stored in synapses, tiny structures on the surface of neurons. These synapses act as conduits, transmitting information via the electrical pulses that pass from neuron to neuron. In the earliest memory models, the strength of a synaptic connection was compared to a volume knob on a stereo: it could be dialed up to boost (or down to lower) the connection strength between neurons. This allowed for the formation of memories.

These models worked extremely well on paper, accounting for an enormous memory capacity. But they also posed an intriguing dilemma.

“The problem with a simple, dial-like model of how synapses function was that it assumed their strength could be dialed up or down indefinitely,” said Dr. Fusi, who is also a member of Columbia’s Center for Theoretical Neuroscience. “But in the real world this can’t happen. Whether it’s the volume knob on a stereo, or any biological system, there has to be a physical limit to how far it can turn.”

When these limits were imposed, the memory capacity of these models collapsed. So Dr. Fusi, in collaboration with fellow Zuckerman Institute investigator Larry Abbott, PhD, an expert in mathematical modeling of the brain, offered an alternative: each synapse is more complex than just one dial, and instead should be described as a system with multiple dials.

In 2005, Drs. Fusi and Abbott published research explaining this idea. They described how different dials (perhaps representing clusters of molecules) within a synapse could operate in tandem to form new memories while protecting old ones. But even that model, the authors later realized, fell short of what they believed the brain — particularly the human brain — could hold.

“We came to realize that the various synaptic components, or dials, not only functioned at different timescales, but were also likely communicating with each other,” said Marcus Benna, PhD, an associate research scientist at Columbia’s Center for Theoretical Neuroscience and the first author of today’s Nature Neuroscience paper. “Once we added the communication between components to our model, the storage capacity increased by an enormous factor, becoming far more representative of what is achieved inside the living brain.”

Dr. Benna likened the components of this new model to a system of beakers connected to each other through a series of tubes.

“In a set of interconnected beakers, each filled with different amounts of water, the liquid will tend to flow between them such that the water levels become equalized. In our model, the beakers represent the various components within a synapse,” explained Dr. Benna. “Adding liquid to one of the beakers — or removing some of it — represents the encoding of new memories. Over time, the resulting flow of liquid will diffuse across the other beakers, corresponding to the long-term storage of memories.”
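The beaker analogy can be put into code. The sketch below is only an illustration of the diffusion idea described above, not the model from the paper itself; the number of beakers, the coupling rate, and the update rule are all assumptions chosen for clarity.

```python
# Illustrative sketch of the beaker analogy: a chain of coupled
# variables ("beakers") whose levels tend to equalize over time.
# Beaker count, coupling strength, and step count are arbitrary
# choices for this demo, not parameters from the paper.

def simulate(levels, coupling, steps):
    """Let liquid diffuse between adjacent beakers for `steps` time steps."""
    levels = list(levels)
    for _ in range(steps):
        new = levels[:]
        for i in range(len(levels) - 1):
            # Flow between neighbors is proportional to their level
            # difference (simple linear diffusion).
            flow = coupling * (levels[i] - levels[i + 1])
            new[i] -= flow
            new[i + 1] += flow
        levels = new
    return levels

# Encoding a new memory: pour liquid into the first (fastest) beaker.
levels = [1.0, 0.0, 0.0, 0.0]
after = simulate(levels, coupling=0.1, steps=200)

# Over time the perturbation spreads into the slower beakers — the
# analogue of long-term storage — while the total amount of liquid
# (the stored information) is conserved.
print(after, sum(after))
```

Running this shows the initial spike in the first beaker flattening out across the chain, mirroring how, in the analogy, a freshly encoded memory diffuses from fast synaptic components into slow ones.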

Drs. Benna and Fusi are hopeful that this work can help neuroscientists in the lab, by acting as a theoretical framework to guide future experiments — ultimately leading to a more complete and more detailed characterization of the brain.

Connected tubes of liquid represent the components of brain cell connections that make memories. Image is credited to Fusi Lab/Columbia University’s Mortimer B. Zuckerman Mind Brain Behavior Institute.

“While the synaptic basis of memory is well accepted, in no small part due to the work of Nobel laureate and Zuckerman Institute codirector Dr. Eric Kandel, clarifying how synapses support memories over many years without degradation has been extremely difficult,” said Dr. Abbott. “The work of Drs. Benna and Fusi should serve as a guide for researchers exploring the molecular complexity of the synapse.”

The technological implications of this model are also promising. Dr. Fusi has long been intrigued by neuromorphic hardware, computers that are designed to imitate a biological brain.

“Today, neuromorphic hardware is limited by memory capacity, which can be catastrophically low when these systems are designed to learn autonomously,” said Dr. Fusi. “Creating a better model of synaptic memory could help to solve this problem, speeding up the development of electronic devices that are both compact and energy efficient — and just as powerful as the human brain.”

About this neuroscience research article

Funding: This research was supported by the Gatsby Charitable Foundation, the Simons Foundation, the Swartz Foundation, the Kavli Foundation, the Grossman Foundation and Columbia’s Research Initiatives for Science and Engineering (RISE).

The authors report no financial or other conflicts of interest.

Source: Anne Holden – Zuckerman Institute
Image Source: Fusi Lab/Columbia University’s Mortimer B. Zuckerman Mind Brain Behavior Institute.
Original Research: Abstract for “Computational principles of synaptic memory consolidation” by Marcus K Benna & Stefano Fusi in Nature Neuroscience. Published online October 3, 2016. doi:10.1038/nn.4401



Computational principles of synaptic memory consolidation

Memories are stored and retained through complex, coupled processes operating on multiple timescales. To understand the computational principles behind these intricate networks of interactions, we construct a broad class of synaptic models that efficiently harness biological complexity to preserve numerous memories by protecting them against the adverse effects of overwriting. The memory capacity scales almost linearly with the number of synapses, which is a substantial improvement over the square root scaling of previous models. This was achieved by combining multiple dynamical processes that initially store memories in fast variables and then progressively transfer them to slower variables. Notably, the interactions between fast and slow variables are bidirectional. The proposed models are robust to parameter perturbations and can explain several properties of biological memory, including delayed expression of synaptic modifications, metaplasticity, and spacing effects.

“Computational principles of synaptic memory consolidation” by Marcus K Benna & Stefano Fusi in Nature Neuroscience. Published online October 3, 2016. doi:10.1038/nn.4401
