Back in 2007, the Environmental Protection Agency (EPA) submitted a report (PDF) to Congress noting that energy usage at data centers had doubled between 2000 and 2006, and warned it would double again by 2011, due in part to the growing use of the Internet. The federal government would then be on the hook for $740 million a year just to pay for electricity for these servers and data centers. Well, here we are in the second half of 2011, and it appears the EPA was wrong.
A new independent report on data center power use from 2005 to 2010 by Jonathan G. Koomey, a consulting professor in the civil and environmental engineering department at Stanford University, found that the number of servers was much lower than anticipated, the result of lower-than-expected demand for computing, The New York Times reports. The financial crisis of 2008 and new technologies, including more efficient processors and server virtualization, helped reduce the need for more servers.
According to the report, electricity use by data centers worldwide increased about 56 percent from 2005 to 2010 rather than doubling; in the U.S., it increased by about 36 percent. To put those numbers into perspective, electricity used by global data centers in 2010 "likely accounted for between 1.1 percent and 1.5 percent of total electricity use," and between 1.7 percent and 2.2 percent in the U.S.
Wondering about Google? Koomey says the sultan of search consumed less than 1 percent of electricity used by data centers worldwide.
Image Credit: Wikimedia Commons