posts about or somewhat related to ‘brains’

How to Remember Anything (runtime ~20 minutes)

For those who have never seen it: a totally useful TED Talk by science journalist Joshua Foer (who is also a co-founder of the absolutely awesome Atlas Obscura). He talks about covering the U.S. Memory Championships, where he learned how humans can train their brains to remember a great deal in very little time. But more importantly, he talks about why we ought to strengthen our memory in an age when the storage of most information can be outsourced to the web.

Related: Last year, Clive Thompson published a fascinating book about how technology is changing the way we think (mostly for the better). Maria Popova reviewed it on Brain Pickings, covering some of his most important observations, namely the difference in transparency between traditional public storehouses of information (i.e., the public library) and modern ones (i.e., the web). And in that context, we wrote a bit about the perils of algorithmic curation.

If Your Brain Was a Hard Drive How Much Information Would it Hold? →

Via Slate:

In its latest taunts directed at South Korea, North Korea’s state-run media has called South Korean President Lee Myung-bak “human scum” and an “underwit with 2MB of knowledge.” How many megabytes should a human brain be able to store?

A lot more than two. Most computational neuroscientists tend to estimate human storage capacity somewhere between 10 terabytes and 100 terabytes, though the full spectrum of guesses ranges from 1 terabyte to 2.5 petabytes. (One terabyte is equal to about 1,000 gigabytes or about 1 million megabytes; a petabyte is about 1,000 terabytes.)

The math behind these estimates is fairly simple. The human brain contains roughly 100 billion neurons. Each of these neurons seems capable of making around 1,000 connections, representing about 1,000 potential synapses, which largely do the work of data storage. Multiply each of these 100 billion neurons by the approximately 1,000 connections it can make, and you get 100 trillion data points, or about 100 terabytes of information.
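The arithmetic above is simple enough to sketch in a few lines. This is just the article's back-of-envelope calculation, with its simplifying assumption of roughly one byte stored per synapse made explicit:

```python
# Back-of-envelope estimate using the article's figures:
# neurons x connections per neuron x bytes per synapse.
neurons = 100e9           # ~100 billion neurons
connections = 1_000       # ~1,000 synaptic connections per neuron
bytes_per_synapse = 1     # the article's simplifying assumption

total_bytes = neurons * connections * bytes_per_synapse
terabytes = total_bytes / 1e12   # decimal terabytes, as the article uses

print(f"{terabytes:.0f} TB")  # prints: 100 TB
```

Note that this uses decimal units (1 TB = 10^12 bytes), matching the article's "about 1,000 gigabytes" rounding.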

Neuroscientists are quick to admit that these calculations are very simplistic. First, this math assumes that each synapse stores about 1 byte of information, but this estimate may be too high or too low. Neuroscientists aren’t sure how many synapses transmit at just one strength versus at many different strengths. A synapse that transmits at only one strength can convey only one bit of information—“on” or “off,” 1 or 0. On the other hand, a synapse that can transmit at many different strengths can store several bits. Secondly, individual synapses aren’t completely independent. Sometimes it may take several synapses to convey just one piece of information. Depending on how often this is the case, the 10-to-100-terabytes estimate may be much too large. Other problems include the fact that some synapses seem to be used for processing, not storage (suggesting that the estimate may be too high), and the fact that there are support cells that might also store information (suggesting that the estimate may be too low).
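That caveat about bits per synapse is where most of the uncertainty lives, and it's easy to see why: the estimate scales linearly with it. The bit counts below are purely illustrative (not figures from the article), just to show how a one-bit "on/off" synapse versus a multi-strength synapse moves the total across orders of magnitude:

```python
# How the storage estimate scales with the (unknown) bits per synapse.
# The synapse count comes from the article; the bit values are illustrative.
SYNAPSES = 100e9 * 1_000  # ~100 trillion synapses

for bits in (1, 8, 64):
    tb = SYNAPSES * bits / 8 / 1e12  # bits -> bytes -> decimal terabytes
    print(f"{bits:>2} bits/synapse -> {tb:,.1f} TB")
# 1 bit/synapse gives 12.5 TB; 8 bits gives 100.0 TB; 64 bits gives 800.0 TB
```

The same linear scaling works in the other direction, too: if several synapses jointly encode one piece of information, the effective total shrinks by that factor, which is why the quoted guesses span such a wide range.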

Now, I don’t know about you, but it’s this last bit about processing that interests me. I’m not so concerned with the total amount of data my brain can hold; I care about access to that data, and the speed of that access.

In other words, it’s RAM, drive speed, and CPU that my brain needs a boost in. That and a spell checker. — Michael