Hitachi recently announced that they would be producing a 5 TB drive in the near future (2010?). This is totally unexciting to me, but what Hitachi's Yoshihiro Shiroishi said about it was. According to TechRadar:
As for what can be stored on such disks, Hitachi's Yoshihiro Shiroishi explains, "By 2010, just two disks will suffice to provide the same storage capacity as the human brain."
In other words, a next-generation hard drive will be able to recall that trip to the seaside in 1976, but never where it left the car keys last night.
Ignoring the faulty-memory joke for a moment: where in the world did Yoshihiro come up with that capacity? How does one calculate what the human brain can store?
According to Ralph C. Merkle:
Several approximations to this number have already appeared in the literature based on "hardware" considerations (though in the case of the human brain perhaps the term "wetware" is more appropriate). One estimate of 10^20 bits is actually an early estimate (by Von Neumann in The Computer and the Brain) of all the neural impulses conducted in the brain during a lifetime. This number is almost certainly larger than the true answer. Another method is to estimate the total number of synapses, and then presume that each synapse can hold a few bits. Estimates of the number of synapses have been made in the range from 10^13 to 10^15, with corresponding estimates of memory capacity
At least at its upper end, this gives us an estimate much higher than Yoshihiro's... so what about estimates from traditional psychology, without all this neuron counting? After all, having a certain number of synapses doesn't mean they're even being used for 'memory'... hell, we're not even sure of all the areas involved in memory. We're pretty sure about some, like the hippocampal cortex, but whether areas involved in processing physical stimuli - like motor areas for tool use - are used as part of the memory representation is up for debate (not for me... I know what I think - and gosh darnit, I'm right!)
So here's the psychological estimate from the same source:
Landauer reviewed and quantitatively analyzed experiments by himself and others in which people were asked to read text, look at pictures, and hear words, short passages of music, sentences, and nonsense syllables. After delays ranging from minutes to days the subjects were tested to determine how much they had retained. The tests were quite sensitive--they did not merely ask "What do you remember?" but often used true/false or multiple choice questions, in which even a vague memory of the material would allow selection of the correct choice. Often, the differential abilities of a group that had been exposed to the material and another group that had not been exposed to the material were used. The difference in the scores between the two groups was used to estimate the amount actually remembered (to control for the number of correct answers an intelligent human could guess without ever having seen the material). Because experiments by many different experimenters were summarized and analyzed, the results of the analysis are fairly robust; they are insensitive to fine details or specific conditions of one or another experiment. Finally, the amount remembered was divided by the time allotted to memorization to determine the number of bits remembered per second.
The remarkable result of this work was that human beings remembered very nearly two bits per second under all the experimental conditions. Visual, verbal, musical, or whatever--two bits per second. Continued over a lifetime, this rate of memorization would produce somewhat over 10^9 bits, or a few hundred megabytes.
Hmm... that only comes out to 10^9 bits ≈ 119.2 megabytes. We've had that much storage for decades... So where's this number coming from? I know! I'll go to Yahoo Answers - maybe they'll have the answer for me there.
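Landauer's back-of-the-envelope checks out in a few lines. The lifetime length and waking hours below are my own illustrative assumptions, not his:

```python
# Sanity-check Landauer's estimate: ~2 bits/second of retained memory.
# ASSUMPTIONS (mine): a 70-year span at 16 waking hours per day.
bits_per_second = 2
waking_seconds = 70 * 365 * 16 * 3600

total_bits = bits_per_second * waking_seconds
print(total_bits)                    # ~2.9e9 bits: "somewhat over 10^9"

megabytes = total_bits / 8 / 10**6   # decimal megabytes
print(round(megabytes))              # a few hundred MB, as the quote says

# And the 10^9 figure itself, in binary megabytes:
print(10**9 / 8 / 2**20)             # ~119.2 MB
```

So "a few hundred megabytes" holds up regardless of how generously you pick the waking hours.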
I have not heard what science believes the human brain maximum capacity would be in those terms.
I do know they say (barring any cranial/brain trauma) that the brain retains pretty much all information of what it sees and hears.., storing up huge libraries of information. The problem with the majority of peoples is the recall ability. If we based this on our recall ability then computers would have us beat hands down!
Which brings me to this conclusion (although drifting from the primary subject a bit)..., whether a person believes in a God creator or just nature at work.., it would seem to suggest the information is in the old cranial system for a reason. I choose to believe for purposes after this life.
Memory is a very complex system. Its even being considered that some element of memory is stored in the body. Just one point in case among many others is the woman who received a heart transplant.., who never smoked in her life and led an otherwise very conservative life..., had urges to smoke, knew how to ride a motorcycle (something she never did before) and wanted to hang around Biker Bars... and some other quite strange changes in her character.
Eventually they found out the heart donor was a biker. This is a true story.., so whether science acknowledges it or not - I say proof of body-memory is in the pudding.., err..., well.., the heart anyway.
Ugh... clearly not.
Can someone direct me to something closer to what the hell Yoshi is talking about?
And you know... it's about the software anyway (for creating AI) - not the hardware. The hardware is the easy part.
My Conclusion: Yoshi needs to clarify what he's talking about, because I think he's blatantly wrong - and the whole idea of 'capacity' as a meaningful quantity when talking about the brain is a mistake in the first place.
Actually, they frankly have it a lot wrong. *Other* studies on, you know, how the damn thing works, rather than on how much capacity it might have, imply that we have something similar to what might be thought of as fractal and extrapolation storage. The first you can find stuff on for image compression: basically, using fractal algorithms you could "compress" an image down to maybe 50% of the current best size, and at the same time be able to blow it up to 2-3 times the original size without a *lot* of detail loss. You would easily be able to recognize it as the original, even if some minor blurring happened here and there. Now, that is *my* take on how you get the data "smaller". Extrapolation is actually observable in things like vision experiments, where the brain will "replace" missing parts with what "should be there". So, combine the two and you get something that takes a drastically compressed bit of data, then "fills in" the gaps from past experience, to produce a high-fidelity result that would otherwise be *impossible* with the "hardware" you are actually using.
The problem is, this compression will also be associative, which adds even "more" complexity and compression. I.e., both the "image" and the "filled in data" come from multiple related correlations and datasets, not from a file called, "This is what the rose I saw yesterday looked like." So, how exactly do you even "guess" what the storage requirements for a data storage system, which we don't even "have" on an artificial platform, really are? It might fit on my iPod... lol
Here's another shot at it: http://mradomski.wordpress.com/2008/05/14/human-brain-capacity-in-terab…
Notice the "I heard somewhere its 10TB" line. Yoshi has become an urban legend!
Dharmendra S. Modha of IBM's cortical simulation project has also given an estimate of how much computing capacity a simulation of a human cerebral cortex would need:
"The human cortex has about 22 billion neurons which is roughly a factor of 400 larger than our rat-scale model which has 55 million neurons. We used a BlueGene/L with 92 TF and 8 TB to carry out rat-scale simulations in near real-time. So, by naïve extrapolation, one would require at least a machine with a computation capacity of 36.8 PF and a memory capacity of 3.2 PB. Furthermore, assuming that there are 8,000 synapses per neuron, that neurons fire at an average rate of 1 Hz, and that each spike message can be communicated in, say, 66 Bytes. One would need an aggregate communication bandwidth of ~ 2 PBps.
Thus, even at a given complexity of synapses and neurons that we have used, scaling cortical simulations to these levels will require tremendous advances along all the three metrics: memory, communication and computation. Furthermore, power consumption and space requirements will become a major technological obstacle that must be overcome. Finally, as complexity of synapses and neurons is increased many fold, even more resources would be required. Inevitably, along with the advances in hardware, significant further innovation in software infrastructure would be required to effectively use the available hardware resources."
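The naïve extrapolation in that quote is easy to reproduce: it just rescales the rat-scale machine by the neuron-count ratio. (I've left the bandwidth figure out of this sketch, since reproducing it depends on messaging details the quote doesn't fully pin down.)

```python
# Reproduce Modha's naive extrapolation from rat scale to human scale.
human_neurons = 22e9
rat_model_neurons = 55e6

scale = human_neurons / rat_model_neurons
print(scale)  # 400.0 -- the "factor of 400" in the quote

rat_compute_tf = 92  # teraflops used for the rat-scale run
rat_memory_tb = 8    # terabytes used for the rat-scale run

print(scale * rat_compute_tf / 1000)  # 36.8 petaflops
print(scale * rat_memory_tb / 1000)   # 3.2 petabytes
```

Note that the 3.2 PB here is machine memory for a simulation at one particular level of detail - not a measurement of the brain's "capacity" in any direct sense.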
On Talking Robots, Henry Markram also talks about this question, but I forgot his exact answer. I think he said something like 300 times the capacity of the internet:
If there is anything to modern cognitive science, it is the recognition that nervous system, body and environment are inextricably linked, and it makes no sense to talk of the storage power of the brain, without inclusion of the complexity of the body and of the physical and social environment.
It's pretty insane that people think they can quantify human memory in terms of hardware memory - it can't be done. At least not yet.
This reminds me of my mammalian physiology class, where my prof also tried to quantify the human brain in binary terms.
Hi Steve. Shelly's friend Tim here. Actually, as a side project over the past year or so, I have done a literature review on this very subject. Conclusion: there's very little out there, it's an open question, and even how to measure the capacity of someone's brain is merely speculative. Anytime anybody tries to model memory capacity, the neuroscience community is up in arms. My buddy and I are actually writing a tongue-in-cheek paper on the subject. I'll send it your way when we finally get it done in a month or so.
Quite simply, until you can say exactly what information is stored in the brain and how, the comparison is meaningless.
I'd like to second (or third? fourth? umpteenth...) the "this is meaningless" statement. You see a lot of these comparisons in the computer science/engineering community because a) they like to think that brains work like computers, and b) they like to think that what they do has any significance beyond a thinly-veiled penis envy (e.g. "my hard drive is larger than yours").
It's the age-old false question problem: You can't measure the 'capacity' of human memory, because 'capacity' isn't a term that relates to human memory. It's the same way you can't say "This hard drive has more capacity than a tree." Both statements are equally ridiculous, but one seems more obviously wrong than the other. The ONLY problem here is that engineers and psychologists have given the same label (memory) to two different things. Of course, that's a less interesting problem....
So.. um... the storage capacity of human memory is whatever a computer science person says his hard drive is, plus 10%. Ok? Not good enough? How about: the storage capacity of human memory is purple.
"The storage capacity of human memory is an asshole."
"The storage capacity of human memory doesn't care about your feelings."
"The storage capacity of human memory slept with your mom."
And so on. They're all equally accurate.
There seems to be way too much conflation here between the way a computer works and the amount a hard drive can store. I shouldn't have to remind anyone here that these aren't the same thing.
If we could figure out not how the brain processes thought, but what it can store and recall, then there's room for a comparison between it and a hard drive. Unless, of course, the way a brain processes thought is inextricably entwined with how it stores memory!
Otherwise, you're right, it is meaningless.
Everyone is trying to estimate our memory capacity in PC terms, but they're all forgetting something: we're not machines! Has anyone ever considered the possibility that our 'storage capacity' might be flexible, and that we're adapting it to our needs every single day?
Hey Steve, my article on memory encoding finally came out! You can go to
to download the original version we wanted (all black and white, all illustrations hand-drawn), and the editor version (with color)
Cheers! Hope you enjoy it!
I'm no expert, but I think the human brain is far more complex than you are making it out to be. That estimate covers only consciously stored memory - really the only form that can be calculated at this time - but consider all your subconscious memories: the random faces you have seen before and can still recognize, language capabilities, recognition of sound, and muscle movements. Then there's the brain stem, which controls the majority of your vital functions without you even thinking about it. Consider the DNA in each cell and the amount of information encoded in it, used to make RNA, then to synthesize proteins, and then enzymes for cell directives. Hormone synthesis, instinct, cell replication. I'm also a serious doubter of the 'we only use 10% of our brain' theory; instead, I believe we use only 10% of our conscious brain. That would make the 120 MB figure just 10% of conscious capacity, meaning the conscious brain would store approximately 1.2 GB - and I still believe that is far off. It would take terabytes upon terabytes, if not hundreds of terabytes, to carry out life processes along with all your thoughts. Plus, we're not electronic equipment.
This whole blasted discussion reminds me of the advent of floating-point notation. It allowed a greater range of stored values for quicker calculation, at the expense of slightly reduced accuracy.
Perhaps a similar change in the storage paradigm is in the works: insane compression algorithms that use tiny bits of memory, "at the expense of some accuracy", that processing can "fill in".
Sort of like the "associative compression" described above.
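As a toy illustration of that idea - purely my own sketch, not anyone's model of memory - here's lossy storage that keeps only every fourth sample of a signal and "fills in" the gaps by linear interpolation at recall time:

```python
# Toy "lossy storage + fill in at recall" sketch: keep 1 sample in 4,
# then reconstruct the missing ones by linear interpolation.
def store(signal, keep_every=4):
    # Lossy: throw away 3 out of every 4 samples.
    return signal[::keep_every]

def recall(stored, keep_every=4):
    # "Fill in" between the kept samples with straight-line guesses.
    out = []
    for a, b in zip(stored, stored[1:]):
        for step in range(keep_every):
            out.append(a + (b - a) * step / keep_every)
    out.append(stored[-1])
    return out

original = [float(x * x) for x in range(9)]  # 0, 1, 4, ..., 64
stored = store(original)                     # just [0.0, 16.0, 64.0]
recalled = recall(stored)

print(stored)
print(recalled)  # tracks the original roughly, with the detail smoothed out
```

The recalled curve is recognizably the original, but the fine detail between kept samples is invented at recall time - which is roughly the flavor of the "processing fills in the gaps" idea above.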
Thanks for this post guys, it inspired me to gel some of my thoughts. I have refrained from dumping my entire post here and will try to just summarise the main points.
Is it possible that the brain stores data in a quantum-like manner? Hashing data not only onto neurons, but via the initiation path as well, making data referencing and storage effectively limitless. The logic behind it, from a systemic point of view, seems feasible enough.
Could this just be an example of how biological evolution had to solve the mass-data-storage problems it surely encountered along the way, arguably millennia ago?
I think these are fairly interesting questions, relevant to both neurology and technology. Regarding humans not being machines or electrical equipment: this distinction is somewhat semantic. From a functional point of view we are basically a type of machine, or many little machines. We think of machines in such a limited sense - we associate them with metal and electricity and disks - however, from a functional perspective all life, from cell structures to populations, is in effect a machine. Just perhaps a biological one. Further, at the level being discussed here, electrical equipment is very much a part of this too; indeed, it is one of the major factors in both fields, neuroscience and technology. There are some similarities, and it is therefore not entirely improbable that we are just starting to scratch the surface of the amount of data a brain deals with. This has forced us to look for ways to manage data. Perhaps our recently acquired data-management and transmission techniques, which have resulted in a somewhat organic entity (again from a functional point of view), could give us some insight into neural algorithm keys and how they develop.
I will leave it at that, please feel free to read further on my blog post http://www.tinyurl.com/mlf8am
Once again, thanks for the spark.
Just because the human brain doesn't use the same binary infrastructure as modern computers doesn't mean that any attempt to quantify its capacity is meaningless. In my opinion, saying that is the same as saying "it's infinite". And, of course, it's not.
What makes it meaningless (at the moment) is our lack of understanding of the type of "compression" and the way the data is stored. Once we can bridge the gap then these things will have meaning.
So, right now it is sort of meaningless but later on it may not be.
I think you guys would be surprised at the similarities between the human brain and modern devices and technology. Yes, they are very different but not as different as some believe. Go google neurological advances and different neurological things scientists have been working on. I was surprised that we were this far along.