The human brain may be capable of storing 10 times more information than previously thought, according to a new study that could lead to a better understanding of memory decline with ageing and disease.
Previous studies have shown that learning and remembering new information strengthens connections between nerve cells in the brain called synapses, which can grow stronger or weaker over time, affecting memory.
However, quantifying the strength of individual synapses has been a challenge for neuroscientists.
It is especially difficult to estimate precisely how much these connections strengthen or weaken, and how much data they can store.
But a new technique, described in the journal Neural Computation, offers fresh insights into synaptic strength, precision of plasticity, and the amount of information these nerve connections can store.
The technique could help quantify the strength of synapses and enhance our understanding of how humans learn and remember, and how those processes change with age or disease.
“We’re getting better at identifying exactly where and how individual neurons are connected to each other,” study co-author Terrence Sejnowski said.
“We have now created a technique for studying the strength of synapses, the precision with which neurons modulate that strength, and the amount of information synapses are capable of storing — leading us to find that our brain can store 10 times more information than we previously thought.”
When a message is transmitted across the brain, it hops from neuron to neuron as an electrical signal, crossing a tiny junction where the two cells meet, called a synapse.
The signal travels through synapses via chemical messengers, and different synapses communicate at different strengths – some may be weak whispers while others may be loud chemical signals.
Specific synapses strengthen to enable the retention of new information in a process called synaptic plasticity.
The human brain has over 100 trillion synapses, and synaptic strength is known to play a role in how memories are stored.
Conditions like Alzheimer’s tend to weaken synapses and reduce the ability of the brain to store information.
In the new study, scientists assessed pairs of synapses from the rat hippocampus, a brain region central to learning and memory.
They applied information theory, a mathematical framework traditionally used to analyse how computers transmit and store data, to determine how many bits of information the synapses could hold.
They found that each rat hippocampal synapse can store between 4.1 and 4.6 bits of information.
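To put those figures in context, a capacity of so many bits corresponds to a number of reliably distinguishable synaptic strength levels: N levels carry log2(N) bits. A minimal sketch of that conversion (the mapping to strength levels is an illustrative reading of the bit figures, not a number quoted in this article):

```python
import math

def bits_to_states(bits: float) -> float:
    """Number of distinguishable strength levels implied by a bit capacity."""
    return 2 ** bits

def states_to_bits(states: float) -> float:
    """Bit capacity implied by a count of distinguishable strength levels."""
    return math.log2(states)

# The study's 4.1 to 4.6 bits per synapse implies roughly
# 17 to 24 distinguishable synaptic strength levels.
low, high = bits_to_states(4.1), bits_to_states(4.6)
print(f"{low:.0f} to {high:.0f} distinguishable levels")
```

In other words, a fraction of a bit more capacity per synapse means many more strength levels a synapse can reliably occupy, which is why small changes in these estimates matter.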
This suggests the human brain may be capable of holding at least a petabyte of information, which the researchers compare to the amount of data on the entire internet.
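For scale, such whole-brain estimates come from multiplying the per-synapse capacity by the number of synapses. A back-of-envelope sketch, where the uniform per-synapse figure and the synapse count are simplifying assumptions; published totals shift by orders of magnitude with these choices:

```python
def brain_capacity_bytes(num_synapses: float, bits_per_synapse: float) -> float:
    """Naive total capacity: assumes every synapse stores the same number of bits."""
    return num_synapses * bits_per_synapse / 8  # 8 bits per byte

# ~100 trillion synapses (the article's figure) at the upper 4.6-bit estimate.
total = brain_capacity_bytes(1e14, 4.6)
print(f"{total / 1e12:.1f} terabytes")
# Larger synapse-count estimates (some run toward ~1e15) push this
# simple product toward the petabyte range quoted by the researchers.
```

The point of the sketch is the scaling, not a definitive number: total capacity grows linearly with both the synapse count and the bits resolved per synapse.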
“Having this detailed look into synaptic strength and plasticity could really propel research on learning and memory, and we can use it to explore these processes in all different parts of human brains, animal brains, young brains, and old brains,” study co-author Kristen Harris said.
“Ultimately, the outcomes should give new insight into how disrupted synapses result in cognitive decline during ageing or as a consequence of neurological disorders.”