OREANDA-NEWS. February 10, 2016. The next time you forget an appointment, misplace your keys or can’t recall someone’s name, you’re going to have a hard time cooking up an excuse: A new study says your brain’s memory capacity is a petabyte — 10 times more than previously thought.

That’s enough to hold more than 13 years of HD-TV recordings, or the DNA of the U.S. population, twice over.
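As a rough sanity check, assume a typical HD video stream of about 2.4 megabytes per second (an assumed bitrate, not a figure from the study); a petabyte then works out to roughly 13 years of continuous recording:

```python
# Back-of-the-envelope check of the "13 years of HD-TV" comparison.
# The stream rate below is an assumption, not a figure from the study.
PETABYTE = 10**15                   # bytes
HD_STREAM = 2.4e6                   # bytes per second (~19 Mbit/s, assumed)
SECONDS_PER_YEAR = 365 * 24 * 3600

years = PETABYTE / (HD_STREAM * SECONDS_PER_YEAR)
print(f"{years:.1f} years of continuous HD video")  # ~13.2 years
```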

The research by the Salk Institute for Biological Studies and collaborators also sheds light on why the brain is so energy efficient, and could lead to computers that combine enormous processing power with low energy consumption.

“This is a bombshell, not just in neuroscience, but it has important implications for computer science as well,” said Terry Sejnowski, a Salk Institute professor and co-senior author of a paper describing the research, which was published in the journal eLife. “It’s giving us a whole new perspective on how the brain functions.”

For Synapses, Size Matters

The researchers determined the brain’s memory capacity by studying synapses, the connections that carry signals between brain cells, or neurons. When it comes to synapses, bigger is better: a synapse’s size determines how much information it can store. Larger synapses have other advantages, too: they’re stronger and carry signals more reliably.

The team created a highly detailed 3D digital reconstruction of tissue from a rat’s hippocampus, the memory center of the brain. The reconstruction, powered by GeForce GTX TITAN GPUs, was the “highest accuracy ever attempted,” said Sejnowski, who is a pioneer in computational neuroscience.

The model, combined with electron microscopy, allowed the team to measure subtle size differences between synapses, leading them to identify about 26 categories of synapse size, with roughly an eight percent size difference between adjacent categories.

In computer terms, that’s about 4.7 bits of information per synapse. Previously, neuroscientists believed hippocampus synapses could hold only one or two bits each.
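The 4.7-bit figure is simply the base-2 logarithm of the number of distinguishable states: a synapse that can sit in any one of 26 size categories encodes log2(26) bits. A quick check:

```python
import math

# 26 distinguishable synapse size categories, per the study
categories = 26
print(f"{math.log2(categories):.2f} bits per synapse")  # ~4.70 bits

# By contrast, 1 or 2 bits allow only 2 or 4 distinguishable states
print(2 ** 1, 2 ** 2)  # 2 4
```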

More Powerful, Energy-Efficient Computers Possible

But the brain’s enormous storage capacity poses a head-scratcher: synapses successfully transmit signals between neurons only 10-20 percent of the time, yet the brain manages to function reliably and efficiently.

“There’s always been this question that, if the synapses are so unreliable, how can the brain get any computation done?” asked Tom Bartol, a Salk staff scientist and an author on the paper.

One possible answer: The synapses constantly adjust their sizes in response to the signals they receive, and then average out their success and failure rates over time.
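Exactly how synapses perform that averaging is still being worked out; the following is a minimal sketch, assuming a fixed 20 percent release probability per incoming signal (an illustrative parameter, not one from the paper), showing how averaging many unreliable transmissions recovers a stable rate:

```python
import random

# Sketch: a synapse that transmits each incoming signal with probability 0.2.
# Any single transmission is unreliable, but the observed rate converges on
# the true one as signals accumulate -- the statistical footing behind the
# "probabilistic" strategy. All parameters here are illustrative.
random.seed(42)
RELEASE_PROBABILITY = 0.2

def transmit() -> bool:
    """One signal arrives; the synapse fires only some of the time."""
    return random.random() < RELEASE_PROBABILITY

for n_signals in (10, 100, 10_000):
    successes = sum(transmit() for _ in range(n_signals))
    print(f"{n_signals:>6} signals -> observed rate {successes / n_signals:.3f}")
```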

Brain Uses as Much Power as a Dim Bulb

What Sejnowski calls the brain’s “probabilistic” strategy could explain its surprising energy efficiency. Because the synapses are active only 10-20 percent of the time, the rest of the time the brain is able to conserve energy. A waking adult brain consumes about 20 watts of power, equal to a dim light bulb.

The study’s findings could lead to computers modeled on the brain’s imperfect but successful system, which “turns out to be as accurate and require much less energy for both computers and brains,” Sejnowski said.

These computers would be ideal for deep learning and artificial neural networks, which require enormous amounts of energy-efficient processing power for tasks like image analysis and speech recognition.