When Gordon Moore first predicted in 1965 that computing power would keep doubling every couple of years, it's hard to imagine even he could have foreseen that by 2008, the world's computers would be crunching a whopping 9,570,000,000,000,000,000,000 bytes of data per year. To put that in more modern terms, that's 9.57 zettabytes, or nearly ten million million gigabytes in total. The good news behind these numbers is that the vast majority of this data is a byproduct of CPUs crunching numbers, not actual human-readable information. Even with that caveat, however, 9.57 zettabytes is still a staggering number to wrap our minds around.
If you assume an average-sized book is about 4.7 centimeters thick and contains about 2.5 megabytes of information, 9.57 zettabytes would create a stack roughly 112 billion miles high, enough to stretch all the way to Neptune and back again twenty times (each round trip covering about 5.6 billion miles). Since these numbers are based on data from 2008, it's safe to say we can probably add a few more laps based on today's CPUs, but when you're dealing with numbers this high, why split hairs over details.
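The stack arithmetic above can be checked with a few lines of Python. Note that the 2.8-billion-mile Earth-to-Neptune distance used here is a rough average assumed for illustration, not a figure from the original study:

```python
# Back-of-envelope check of the book-stack figures, using the
# article's inputs: 2.5 MB and 4.7 cm per book, 9.57 ZB of data.
DATA_BYTES = 9.57e21          # 9.57 zettabytes
BYTES_PER_BOOK = 2.5e6        # ~2.5 megabytes per book
BOOK_THICKNESS_M = 0.047      # 4.7 centimeters
METERS_PER_MILE = 1609.344

books = DATA_BYTES / BYTES_PER_BOOK
stack_miles = books * BOOK_THICKNESS_M / METERS_PER_MILE

# Assumption: Neptune averages roughly 2.8 billion miles from Earth,
# so one round trip is about 5.6 billion miles.
NEPTUNE_ROUND_TRIP_MILES = 5.6e9
round_trips = stack_miles / NEPTUNE_ROUND_TRIP_MILES

print(f"{books:.2e} books stack {stack_miles:.2e} miles high")
print(f"about {round_trips:.1f} round trips to Neptune")
```

Running the numbers gives about 3.8 quadrillion books and a stack of roughly 112 billion miles, which works out to very nearly twenty round trips, matching the study's claim.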
By 2024, the researchers behind the study expect this theoretical pile of books to grow tall enough to reach the nearest star system, Alpha Centauri, 4.37 light-years away.