New research out of the Salk Institute indicates that the total memory capacity of the brain is in the petabyte range, roughly on par with the entire World Wide Web. The research was published in the journal eLife.
The study uncovers critical insights into the size of neural connections, putting the memory capacity of the brain far higher than commonly thought. The work also addresses the question of how the brain manages to be so energy efficient, and could help engineers build more powerful computers that use less energy.
“We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power. Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web,” said Salk Professor Terry Sejnowski. “This is a real bombshell in the field of neuroscience.”
Scientists used advanced microscopy and computational algorithms to reconstruct hippocampal brain tissue down to the nanometer scale. The analysis showed that synapses of all sizes could vary in increments as small as eight percent, across a range spanning a factor of 60. Using the newfound data, the team determined that there could be about 26 categories of synapse sizes, rather than just a few, as was previously thought. This means the brain's memory capacity was likely grossly underestimated, since the storage capacity of neurons depends on synapse size.
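As a back-of-envelope illustration (not the study's full signal-detection analysis), about 26 distinguishable size categories correspond to roughly log2(26) ≈ 4.7 bits of storable information per synapse. The sketch below also shows why naively counting 8-percent steps across a factor of 60 gives a larger number than 26; the paper's category count accounts for measurement noise, so the naive count is only an upper bound:

```python
import math

# Reported measurements: synapse sizes span a factor of ~60,
# and the study distinguished ~26 size categories.
size_range_factor = 60
num_categories = 26

# If each of the 26 categories is a distinguishable state, the
# information per synapse is log2 of the category count.
bits_per_synapse = math.log2(num_categories)
print(f"bits per synapse: {bits_per_synapse:.2f}")  # ~4.70

# Naive count of 8% increments spanning a factor of 60: this exceeds
# 26 because it ignores the noise that limits how many categories are
# reliably distinguishable in practice.
naive_steps = math.log(size_range_factor) / math.log(1.08)
print(f"naive 8%-step count: {naive_steps:.0f}")
```

A few bits per synapse, multiplied across trillions of synapses, is what puts the whole-brain estimate into the petabyte range.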
The research also suggests that the brain's remarkable efficiency stems from synapses constantly adjusting themselves based on the signals they receive. This efficiency persists even though transmission from one neuron to another typically succeeds only 10 to 20 percent of the time, and the brain itself uses only about as much power as a dim light bulb.
The analysis indicated that for the smallest synapses, about 1,500 signaling events (roughly 20 minutes' worth) were needed to produce a change in size and strength, while for the largest synapses only a couple hundred events (1 to 2 minutes' worth) were needed. A change was counted when a synapse grew or shrank by 8 percent.
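Taking the reported figures at face value, the implied signaling rates can be worked out directly; what differs most between small and large synapses is how many events it takes to trigger a size change, not the underlying event rate (the 1.5-minute value below is an assumed midpoint of the reported 1-to-2-minute window):

```python
# Reported figures (approximate) for how many signaling events, and how
# much time, pass before a synapse steps to the next size category.
small = {"events": 1500, "minutes": 20.0}  # smallest synapses
large = {"events": 200, "minutes": 1.5}    # largest synapses (1-2 min)

for name, s in (("smallest", small), ("largest", large)):
    rate = s["events"] / s["minutes"]
    print(f"{name} synapses: {s['events']} events over "
          f"{s['minutes']:g} min, i.e. ~{rate:.0f} events/min")
```

Both work out to events arriving on the order of 100 per minute, which matches the article's point that synapses of every size are continually bombarded with signals and adjust accordingly.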
“This means that every 2 or 20 minutes, your synapses are going up or down to the next size. The synapses are adjusting themselves according to the signals they receive,” said Salk scientist Tom Bartol.
“The implications of what we found are far-reaching,” said Prof. Sejnowski. “Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses that was hidden from us.”
Scientists expect the landmark research to have important applications well beyond neuroscience.
“This trick of the brain absolutely points to a way to design better computers,” says Sejnowski. “Using probabilistic transmission turns out to be as accurate and require much less energy for both computers and brains.”