Just how much information is on the internet?


leif_erikson
Jul 31, 2010, 06:34 AM
What's the estimated amount of information currently on the internet, in bytes? Is it in the zettabytes yet? My own estimate is on that scale: I assumed that every single person in the world accounts for about 1 TB of information, which for roughly 7 billion people works out to about 7 zettabytes, and then I rounded up to the next SI prefix. I bet this is still a huge underestimate, because the CIA and other intelligence agencies must hold something like 1000 times more information per person who works there. (However, I doubt that this information would be connected to the internet that everybody uses.)
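A quick sketch of that arithmetic in Python; the 7-billion population figure and the 1 TB per person are the assumptions from the post above, not measured values:

world_population = 7e9        # rough 2010 world population (assumption)
bytes_per_person = 1e12       # 1 TB per person, as assumed above

total_bytes = world_population * bytes_per_person
print(f"{total_bytes:.1e} bytes")   # 7.0e+21 bytes = 7 ZB
# 7 ZB, rounded up to the next SI prefix, lands at the yottabyte scale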

Also, does anybody know the rate function R(t) at which the internet grows per year (assuming that nothing ever gets deleted)? I'm trying to design a cost-effective, efficient server/hard drive that can download all of the world's information at breakneck speed. (Of course, there will be an upper limit to this speed, since no information can travel faster than the speed of light.)
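For a sense of the scale involved, here is a rough Python sketch of the download time over a single link; the 1 ZB total and the 100 Gbit/s link speed are purely illustrative assumptions:

total_bytes = 1e21               # assume ~1 ZB to download (round figure)
link_bits_per_s = 100e9          # assumed 100 Gbit/s link
seconds = total_bytes * 8 / link_bits_per_s
years = seconds / 3.15e7         # ~3.15e7 seconds per year
print(f"{years:.0f} years")      # ~2540 years over one such link

Long before the speed of light matters, raw bandwidth is the binding constraint.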

This is a big challenge and I'm ready to take it on. History must be recorded, and all information must be preserved in some form. Books and paper documents take up a lot of physical space, so computer technology seems to be the best library available right now. (Well, at least in theory.)

P.S. No, I'm not crazy. I enjoy challenges of all types in math, theoretical physics and engineering.

Curlyben
Jul 31, 2010, 09:59 AM
It is IMPOSSIBLE to even begin to make a reliable estimate of the volume of data on the net.
Google's own datacentres hold petabytes of data; couple that with Microsoft's, Yahoo!'s, AOL's... the list goes on and on and on.

Bear in mind there were over 240 MILLION sites on the net according to Netcraft in Oct 2009, so I think you're now starting to understand the vastness of the problem.

Wondergirl
Jul 31, 2010, 10:17 AM
According to the annual survey of the global digital output by International Data Corporation, the current total amount of global data is expected to pass 1.2 zettabytes sometime during 2010.

From Zettabyte - Wikipedia, the free encyclopedia (http://en.wikipedia.org/wiki/Zettabyte)

leif_erikson
Jul 31, 2010, 08:47 PM
Thank you. Well, I guess I'll plan for yottabytes to be on the safe side. As for the rate function R(t), I'm still investigating it, and it's a very long and painstaking process. For the server/hard drive, I'm tapping into QCD and electroweak theory as a means of engineering it: if a material could hold as many transistors as it has nuclei, it might just be possible. The only problem is that the strong interaction is not yet understood well enough by physicists for many practical applications. I'm going into research on grand unified theories and astroparticle biophysics, i.e. studying the evolution of life on or around astronomical bodies (including planets like Earth) based on the laws of physics that govern the universe.

As for the rate function, I can use it to approximate how much information is on the internet as time goes by. Integrating R(t) from 2010 to t and then adding the 1.2 zettabytes quoted above gives the approximate amount of information expected to be on the internet as a function of time, I(t) = 1.2 ZB + integral of R(tau) d(tau) from 2010 to t, defined for 2010 < t < some later date. But this is still tough, because R(t) doesn't have to be a consistent function of time the way physical laws are. It varies with the decision-making of economists, politicians, scientists, engineers and the general public. Only an approximate rate function can be determined, and it will have to be checked on a case-by-case basis.
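A minimal Python sketch of that projection, assuming (purely for illustration) an exponential R(t); the growth constant k is made up, and a real rate function would have to be re-fitted case by case, as noted above:

import math

I0 = 1.2e21   # bytes on the net in 2010 (IDC figure quoted above)
k = 0.5       # assumed fractional growth per year (illustrative only)

def R(t):
    """Hypothetical growth rate, in bytes per year, at year t."""
    return k * I0 * math.exp(k * (t - 2010))

def I(t, steps=1000):
    """I(t) = I0 + integral of R from 2010 to t, via the midpoint rule."""
    dt = (t - 2010) / steps
    total = I0
    for i in range(steps):
        total += R(2010 + (i + 0.5) * dt) * dt
    return total

for year in (2011, 2015, 2020):
    print(year, f"{I(year):.2e} bytes")

Swapping in a different R(t) only means replacing the one function; the numerical integration stays the same, which is what makes the case-by-case checking tractable.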