# How much information is there in the world? Scientists calculate the world's total technological capacity

(PhysOrg.com) -- Think you're overloaded with information? Not even close. A study appearing on Feb. 10 in Science Express calculates the world's total technological capacity -- how much information humankind is able to store, communicate and compute.

"We live in a world where economies, political freedom and cultural growth increasingly depend on our technological capabilities," said lead author Martin Hilbert of the USC Annenberg School for Communication & Journalism. "This is the first time-series study to quantify humankind's ability to handle information."

So how much information is there in the world? How much has it grown?

Prepare for some big numbers:

• Looking at both digital memory and analog devices, the researchers calculate that humankind is able to store at least 295 exabytes of information. (Yes, that's a number with 20 zeroes in it.)

Put another way, if a single star is a bit of information, that's a galaxy of information for every person in the world. That's 315 times the number of grains of sand in the world. But it's still less than one percent of the information that is stored in all the DNA molecules of a human being.
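
The star comparison can be sanity-checked with quick arithmetic. A sketch, assuming a 2007 world population of about 6.6 billion and roughly 3 × 10^11 stars in a large galaxy such as the Milky Way (both figures are my assumptions, not the article's):

```python
EXABYTE = 10**18

total_bytes = 295 * EXABYTE   # the study's storage estimate
total_bits = total_bytes * 8  # one "star" per bit

population = 6.6e9            # assumed 2007 world population
bits_per_person = total_bits / population

# ~3.6e11 bits per person, i.e. a star count on the scale of a
# large galaxy for every person in the world.
print(f"bits per person: {bits_per_person:.2e}")
```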

• 2002 could be considered the beginning of the digital age, the first year in which worldwide digital storage capacity overtook total analog capacity. As of 2007, almost 94 percent of our memory is in digital form.

• In 2007, humankind successfully sent 1.9 zettabytes of information through broadcast technology such as televisions and GPS. That's equivalent to every person in the world reading 174 newspapers every day.

• On two-way communications technology, such as cell phones, humankind shared 65 exabytes of information through telecommunications in 2007, the equivalent of every person in the world communicating the contents of six newspapers every day.
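
The two newspaper comparisons can be cross-checked against each other. Assuming a 2007 world population of about 6.6 billion (my assumption, not the article's), both bullets imply nearly the same per-newspaper size, which suggests the comparisons at least use a consistent yardstick:

```python
population = 6.6e9   # assumed 2007 world population
days = 365

broadcast = 1.9e21   # 1.9 zettabytes broadcast in 2007
telecom = 65e18      # 65 exabytes via two-way telecom in 2007

# Implied size of one "newspaper" under each comparison.
paper_bcast = broadcast / days / population / 174
paper_tele = telecom / days / population / 6

# Both come out near 4.5 MB per newspaper.
print(f"{paper_bcast/1e6:.1f} MB vs {paper_tele/1e6:.1f} MB")
```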

• In 2007, all the general-purpose computers in the world computed 6.4 x 10^18 instructions per second, in the same general order of magnitude as the number of nerve impulses executed by a single human brain. Carrying out one second's worth of those instructions by hand would take 2,200 times as long as the time since the Big Bang.
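
The "times the Big Bang" figure is plausible under reasonable assumptions. A sketch, assuming a universe age of ~13.7 billion years and one hand calculation every ~2.5 minutes (both assumptions are mine, not the study's):

```python
SECONDS_PER_YEAR = 3.156e7
universe_age_s = 13.7e9 * SECONDS_PER_YEAR  # ~4.3e17 seconds

instructions = 6.4e18     # one second of worldwide computing in 2007
sec_per_hand_calc = 150   # ~2.5 minutes per instruction done by hand

hand_time_s = instructions * sec_per_hand_calc
ratio = hand_time_s / universe_age_s
print(f"{ratio:.0f} x the time since the Big Bang")  # ~2,200
```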

• From 1986 to 2007, the period of time examined in the study, worldwide computing capacity grew 58 percent a year, ten times faster than the United States' GDP.

• Telecommunications grew 28 percent annually, and storage capacity grew 23 percent a year.
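
The quoted growth rates translate directly into doubling times via compound growth. A quick sketch:

```python
import math

# Capacity growing at a fixed annual rate doubles every
# ln(2) / ln(1 + rate) years.
rates = {"computing": 0.58, "telecom": 0.28, "storage": 0.23}
doubling = {name: math.log(2) / math.log(1 + r) for name, r in rates.items()}

for name, years in doubling.items():
    print(f"{name}: doubles every {years:.1f} years")
```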

"These numbers are impressive, but still minuscule compared to the order of magnitude at which nature handles information," Hilbert said. "Compared to nature, we are but humble apprentices. However, while the natural world is mind-boggling in its size, it remains fairly constant. In contrast, the world's technological information processing capacities are growing at exponential rates."

More information: M. Hilbert and P. López, "The world's technological capacity to store, communicate and compute information," Science Express, Feb. 10, 2011.

Feb 10, 2011
Calculating how many bytes each portion of information consumes is a tricky business. For example, a 1-minute TV data stream has more bytes than a twenty-page Wikipedia article with pictures--but what has more information?

Feb 10, 2011
JRDarby: Yes, there is a very important distinction between data, information and knowledge. Data is a record of related quantities. Information is organised data. Knowledge is meaningfully interpreted information.

------------------------------------------------

I think we are on the cusp of a technological and social change far greater in scope and impact than the Renaissance or the Industrial Revolution.

Feb 10, 2011
> Calculating how many bytes each portion of information consumes is a tricky business. For example, a 1-minute TV data stream has more bytes than a twenty-page Wikipedia article with pictures--but what has more information?

I agree with you completely; the comparisons are apples and oranges. Comparing the bit rate of an audio stream to a newspaper text file is ridiculous. The phone conversation holds more data per se, but information-wise, the newspaper holds much more - even more so when you count six newspapers a day.

It sounds like the researchers got overwhelmed trying to calculate the different types of data and information, and basically cheated by using bad shortcuts. This may be useful as a data comparison, but more than likely it is flat wrong as an information comparison.

Feb 10, 2011
FYI: the term "information" as used in the article above has a very specific technical meaning (look up "Information Theory"), which is quite distinct from the colloquial meaning of the word.
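
In that technical sense, the information content of a source is measured by its Shannon entropy rather than by its raw byte count. A minimal sketch of per-symbol entropy:

```python
import math
from collections import Counter

def entropy_bits_per_symbol(text: str) -> float:
    """Shannon entropy H = -sum(p * log2(p)) over symbol frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A repetitive stream compresses well (low entropy); varied text carries
# more information per symbol, regardless of how many raw bytes it uses.
print(entropy_bits_per_symbol("aaaaaaab"))   # ~0.54 bits/symbol
print(entropy_bits_per_symbol("abcdefgh"))   # 3.0 bits/symbol
```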

Feb 10, 2011
Totally skewed results- they forgot to include this study in the calculations.

Feb 11, 2011
It's probably better to measure the information velocity or acceleration. Calculating a snapshot is of limited use. Perhaps you could use it to fit a curve once you have multiple snapshots. Either way, it will most likely be an exponential conforming to Moore's Law.
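
That curve fit can be sketched directly: take the log of each capacity snapshot and fit a line, whose slope gives the annual growth rate. The 1986 and 2007 endpoints below approximate the study's storage figures; the intermediate points are invented placeholders for illustration:

```python
import math

# year -> storage capacity in bytes (endpoints ~ the study's figures,
# intermediate values invented for illustration)
snapshots = {1986: 2.6e18, 1993: 1.6e19, 2000: 5.5e19, 2007: 2.95e20}

xs = sorted(snapshots)
ys = [math.log(snapshots[x]) for x in xs]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)

# Least-squares slope of log(capacity) vs year.
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
annual_growth = math.exp(slope) - 1
print(f"fitted growth: {annual_growth:.0%} per year")  # close to the ~23%/yr above
```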

Feb 11, 2011
I would like to see an estimate of biological computing capacity for Earth, and when our digital computing capacity will exceed it. I suspect it is around 2040.

Feb 11, 2011
Definition of a nerd. Easily impressed by big numbers. "Awesome. Look at those huge exabytes!"

Feb 11, 2011
> I would like to see an estimate of biological computing capacity for Earth, and when our digital computing capacity will exceed it. I suspect it is around 2040

Doubling trend suggests ~2050.

Tianhe-1A, our fastest supercomputer is already at the lower bound for single human brain computational capacity.

Feb 11, 2011
Too bad we, as a species, seem to be oversaturating our information handling ability, and knowledge, in the process, gets diluted. If you ask my daughter, glued to her 'smart'phone, about it, she won't hear you...

Feb 11, 2011
According to Ray Kurzweil, our destiny is that every molecule in our universe is to be used as information.

Feb 11, 2011
> I would like to see an estimate of biological computing capacity for Earth, and when our digital computing capacity will exceed it. I suspect it is around 2040

> Doubling trend suggests ~2050.

> Tianhe-1A, our fastest supercomputer is already at the lower bound for single human brain computational capacity.

You are so far off it's not funny. A human brain does about 100 petaflops, while Tianhe-1A does 2.5 petaflops. That's almost two orders of magnitude difference.

Stop spewing off information that you have not researched or know anything about.

As for total biological parity: information, especially in the biological sense, is so semantic as to make the exercise asinine. Consider that this researcher would take every strand of DNA as information and DNA replication or protein synthesis as computing, and you understand how absurdly large that number would be.

Won't let me post supporting links, so search "petaflops brain" / "Tianhe-1A" for documentation.

Feb 12, 2011
> You are so far off it's not funny. A human brain does about 100 petaflops, while Tianhe-1A does 2.5 petaflops. That's almost two orders of magnitude difference.

Depending on how the complexity is measured, it can vary between 10^15 and 10^19 IPS. Those are estimates for brain simulation. That's why I said lower bound.
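
Combining the study's 2007 computing figure and growth rate with a mid-range brain estimate gives a crossover date close to the guesses in this thread. A sketch; the biology-side numbers (10^16 IPS per brain, a fixed 6.6 billion people) are assumptions, not from the study:

```python
import math

digital_2007 = 6.4e18       # worldwide general-purpose IPS (from the study)
growth = 1.58               # 58% annual growth (from the study)
biological = 1e16 * 6.6e9   # assumed IPS per brain x assumed population

# Years until digital capacity, compounding at 58%/yr, reaches the
# (assumed constant) biological total.
years = math.log(biological / digital_2007) / math.log(growth)
print(f"crossover around {2007 + years:.0f}")  # ~2042
```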