From HumanBrain:
What is an estimate of the capacity of the brain? GBytes, TBytes ...? I gather the processing speed is not that fast (on the order of milliseconds), but that does not seem to matter because it is an AssociativeMemory overlaid on a NeuralNetwork. If one could see the DeepClassHierarchies, FatClassHierarchies or MindMaps of TheRealBrain?, how many objects would there be? I realize the MetaLanguage of TheRealBrain? may not be isomorphic to OO DataStructures, but assuming there is a mapping to Classes/Objects, then roughly how many? For instance, if you wrote a class to calculate tennis scores, presumably the structures in the brain that do the same thing would not use many more bytes than, say, a C++ Class (compiled?).
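To make the last point concrete, here is a minimal sketch of the tennis-scoring example as a C++ class. It is my own toy, simplified to games within a single set (no tiebreaks), but it shows what the question is getting at: the per-object data is only a few bytes, and the compiled scoring logic is shared by every object rather than stored per instance.

  // Toy tennis-scoring class; per-object data is just a few bytes.
  #include <cstdint>
  #include <iostream>

  class TennisGame {
  public:
      void pointTo(int player) {            // player is 0 or 1
          points_[player]++;
          // Win the game with at least four points and a two-point lead
          // (this also covers deuce/advantage).
          if (points_[player] >= 4 && points_[player] - points_[1 - player] >= 2) {
              games_[player]++;
              points_[0] = points_[1] = 0;
          }
      }
      std::uint8_t games(int player) const { return games_[player]; }
  private:
      std::uint8_t points_[2] = {0, 0};     // points in the current game
      std::uint8_t games_[2]  = {0, 0};     // games won so far
  };

  int main() {
      std::cout << "bytes per TennisGame object: " << sizeof(TennisGame) << "\n";  // typically 4
  }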
... the processing capacity of the brain has not been reliably determined. But a fair estimate is that the 1.5 kilogram organ has 10^10 neurons, [each] with 10^3 synapses firing an average 10 times per second, which is about 10^14 bits/second. Using 64-bit words like the largest supercomputers, that's about one teraflop. -- Robert A Freitas, "The Future of Computers", _Analog_, March 1996.
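For anyone who wants to check the quote's arithmetic, here is the same back-of-envelope calculation spelled out, using only the figures quoted above:

  // Reproducing the estimate in the Freitas quote.
  #include <cstdio>

  int main() {
      double neurons  = 1e10;   // neurons in the brain (quoted figure)
      double synapses = 1e3;    // synapses per neuron (quoted figure)
      double firingHz = 10.0;   // average firings per second (quoted figure)
      double bitsPerSec  = neurons * synapses * firingHz;   // ~1e14 bits/s
      double wordsPerSec = bitsPerSec / 64.0;               // ~1.6e12 64-bit words/s
      std::printf("%.1e bits/s ~= %.1e 64-bit words/s (loosely, a teraflop-scale rate)\n",
                  bitsPerSec, wordsPerSec);
  }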
Anybody want to do the math for storage over a lifetime? Assume that people remember back as far as 3 years old (I can remember earlier), and assume no organic defect leading to memory loss.
Also, assume 60 frames per second during waking hours, video, audio, smell, temperature, motion, pain/pleasure, emotional tone, learned facts, analytical thought, and so on.
(We're not addressing software storage, the source of ideas and creativity and whatnot, just data.)
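Here is one attempt at the math, for the video channel only, using the assumptions above; the resolution, bit depth, waking hours and lifespan are my own arbitrary choices, so treat the result as nothing more than an order-of-magnitude guess.

  // Back-of-envelope lifetime storage for raw, uncompressed "video" alone.
  // Resolution, bit depth, waking hours and lifespan are invented for scale.
  #include <cstdio>

  int main() {
      double fps        = 60.0;                    // frames per second (assumed above)
      double pixels     = 1280.0 * 720.0;          // assumed frame resolution
      double bytesPerPx = 3.0;                     // 24-bit colour, no compression
      double wakingSecs = 16.0 * 3600.0;           // 16 waking hours per day
      double days       = (80.0 - 3.0) * 365.25;   // remembering from age 3 to 80
      double videoBytes = fps * pixels * bytesPerPx * wakingSecs * days;
      std::printf("raw video alone: %.1e bytes (~%.0f petabytes)\n",
                  videoBytes, videoBytes / 1e15);  // roughly a few hundred PB
  }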
Bah. The brain stores in vector, not raster
Oh, good, then a few GB should do the trick ... ?
Unless you factor in all the porn people have seen... then maybe a few PB.
Seems to me that the brain is mostly a cache. That explains why memorization works. Repeat something to yourself over and over, and you can run up a lot of cache hits; you fool the brain into thinking that the data is important and should be cached longer. Also explains how you can easily forget how to do something that you haven't done in a long time.
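To illustrate the cache analogy (and only the analogy, not a claim about how neurons actually work), here is a tiny least-recently-used store: items you keep rehearsing stay near the front, and anything untouched for long enough falls off the back, like a skill you haven't used in years.

  // Illustration of the cache analogy only: a minimal LRU store.
  #include <cstddef>
  #include <iostream>
  #include <list>
  #include <string>
  #include <unordered_map>

  class LruMemory {
  public:
      explicit LruMemory(std::size_t capacity) : capacity_(capacity) {}

      void rehearse(const std::string& item) {
          auto it = index_.find(item);
          if (it != index_.end()) items_.erase(it->second);   // refresh: pull to the front
          items_.push_front(item);
          index_[item] = items_.begin();
          if (items_.size() > capacity_) {                    // evict the stalest item
              index_.erase(items_.back());
              items_.pop_back();
          }
      }
      bool remembers(const std::string& item) const { return index_.count(item) > 0; }
  private:
      std::size_t capacity_;
      std::list<std::string> items_;                          // front = most recently used
      std::unordered_map<std::string, std::list<std::string>::iterator> index_;
  };

  int main() {
      LruMemory m(3);
      m.rehearse("phone number"); m.rehearse("locker combo"); m.rehearse("song lyrics");
      m.rehearse("phone number");                             // repetition keeps it fresh
      m.rehearse("new PIN");                                  // evicts "locker combo"
      std::cout << std::boolalpha << m.remembers("locker combo") << "\n";  // false
  }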
From SevenPlusOrMinusTwo:
Every part of the brain serves one or more purposes.
So in GBytes, what is a rough capacity of the brain for memory + cognitive processing? If you could do a df -h on a typical PhD, what % would be free? No slights against post-graduates, please. And why should short-term memory be so minuscule in comparison? Best case, 9 "chunks" must be less than a KByte.
Excellent question. The pat answer is "5-9 chunks must be enough." That, I think, is the path to the answer: figure out what we're actually doing with our STMs and we'll be able to see that 5-9 is enough. The "brains are like computers" school of thought suggests that STM is analogous to computer registers: that's where all the action takes place, and there are not many registers present or needed in the generalized von Neumann architecture. But the brain isn't a von Neumann architecture, so why so few? My personal hypothesis is that STM is an artifact of consciousness and is a focus area, and that's why there are so few; a consciousness needs no more than 5-9 chunks to perform its main function, which is examination of a series of events for relevance and success probability. Oh, and of course: the limit both forces and is abetted by the highly useful generalization/abstraction capability of the intellect; once you get more than the limit, you've got to invent a type or class to reason about, which is something we do so deftly it's (almost) an unconscious skill. - LaurencePhillips
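A trivial illustration of the chunking point: the same digits blow past the 5-9 limit taken one at a time, but fit comfortably once grouped into a few familiar chunks. The grouping into area code / prefix / line is just my example; note too that each "slot" is better thought of as a reference to something already in long-term memory, so 9 chunks says very little about bytes.

  // Illustration of chunking: 10 raw items versus 3 named chunks.
  #include <iostream>
  #include <string>
  #include <vector>

  int main() {
      std::string rawDigits = "8005551234";                     // 10 separate items
      std::vector<std::string> chunks = {"800", "555", "1234"}; // 3 familiar groups
      std::cout << "raw items: " << rawDigits.size() << " (over the 5-9 limit)\n"
                << "chunked:   " << chunks.size()    << " (comfortably within it)\n";
  }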
So in GBytes what is a rough capacity of the brain ... ?
One of the difficulties in measuring this is that the human brain excels at data compression, but unfortunately it is usually lossy data compression.
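If you want the lossy/lossless distinction spelled out, here is a toy quantizer: it saves space by throwing away precision, and the reconstruction is close but not exact, which is roughly how remembered scenes relate to the originals.

  // Toy lossy compression: quantize samples to 16 levels (4 bits each).
  // The round trip keeps the gist but not the detail.
  #include <cstdint>
  #include <cstdio>

  int main() {
      double original[] = {0.12, 0.47, 0.51, 0.93};
      for (double x : original) {
          std::uint8_t code = static_cast<std::uint8_t>(x * 15.0 + 0.5); // 4 bits, not 64
          double restored   = code / 15.0;
          std::printf("original %.2f -> restored %.2f\n", x, restored);
      }
  }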
We know the capacity is at least as large as whatever people have managed to store in it.
An even better question would be "So what are good ways to use all that capacity?".
Big brains in animals are a curious subject. Animals still show a wide variety of brain sizes after almost a billion years of evolution, but the larger end of the spectrum has grown over time. The same pattern holds for "complexity", as measured by the number of different "cell types" (estimated from fossils): the high end of the complexity spectrum keeps increasing even though simple and mid-level animals remain.
But as far as brain size goes, why didn't early animals simply grow big brains for the advantage? The Cambrian Anomalocaris had a body big enough to support a big brain. The main problem appears to be that big brains consume a lot of energy; the advantage of a big brain has to compensate for the extra energy it needs.
The human brain may be a biological fluke, an anomaly similar to the peacock's tail. The tail is hugely expensive to maintain and puts the animal at risk, but it persists as a mating display. Perhaps human brains are similar, in that cognitive skills, and the related social skills, became our version of the peacock's tail: women were attracted to the cleverer guy, putting upward pressure on brain size.
Or it could be that our social structure and hunting technology allowed us to have smaller and wimpier bodies so that more food could be devoted to the brain. You don't need huge muscles if you can use a spear effectively with a team of hunters.
Another theory is that the ability to learn multiple languages allowed trade to improve among tribes: the better a tribe was at learning new languages, the more it could trade, increasing mutual wealth. Your area has good spear wood but no good rocks for points, so you trade with another tribe that has better access to rocks but poor wood.
Social complexity was also slow to evolve. Most early animals fended for themselves, alone or in loose-knit groups.
--top
See also: MyBrainIsFull