Growth in Electricity Use at Data Centers Slows, Study Says

The rapid growth in data-center electricity use that prevailed from 2000 to 2005 slowed significantly from 2005 to 2010, according to a new report authored by Jonathan Koomey, a consulting professor at Stanford University. Still, data centers accounted for about 1.3 percent of global electricity use in 2010, and about 2 percent of all electricity use in the United States.

Koomey attributed the slowdown in part to the introduction and improvement of virtualization technologies, as well as the industry's intense focus on server efficiency beginning in 2006. However, the biggest contributor seems to have been the 2008 financial crisis and the economic slowdown that followed, which left the worldwide installed base of data-center servers significantly smaller than earlier projections by IDC and the Environmental Protection Agency had anticipated.

In the U.S., for example, the electricity used by data centers from 2005 to 2010 increased about 36 percent instead of doubling. Similarly, worldwide electricity consumption by data centers increased about 56 percent from 2005 to 2010 instead of doubling. "It would appear that the financial crisis had a larger effect on the U.S. data-center market compared to earlier expectations than it did on the world market," Koomey observed.
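As a rough back-of-the-envelope translation (the arithmetic below is ours, not the report's), those five-year totals imply compound annual growth rates of roughly 6 percent for the U.S. and 9 percent worldwide, versus about 15 percent for a doubling:

```python
# Back-of-the-envelope: convert five-year growth totals into compound
# annual growth rates (CAGR). The totals are from the article; the
# conversion is standard arithmetic, not a figure from the report.

def cagr(total_growth: float, years: int = 5) -> float:
    """Annualized rate implied by a total fractional increase."""
    return (1 + total_growth) ** (1 / years) - 1

print(f"U.S., +36% over 2005-2010:   {cagr(0.36):.1%}/yr")  # ~6.3%/yr
print(f"World, +56% over 2005-2010:  {cagr(0.56):.1%}/yr")  # ~9.3%/yr
print(f"Doubling (+100%), for scale: {cagr(1.00):.1%}/yr")  # ~14.9%/yr
```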

One Wild Card

Servers are the largest and most important consumers of electricity in data centers. However, Koomey's model also includes electricity consumption for data storage, communications gear and infrastructure equipment such as cooling systems, fans and pumps, along with losses in the backup-power and power-distribution systems.
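For readers curious how such a bottom-up accounting works in broad strokes, here is a minimal sketch that sums IT loads and scales them by a facility overhead multiplier (PUE); it illustrates the structure of this kind of model with entirely hypothetical numbers, not Koomey's actual calculations:

```python
# Minimal sketch of a bottom-up data-center electricity model: sum the
# IT loads (servers, storage, network), then scale by a facility
# overhead multiplier (PUE) to capture cooling, fans, pumps and
# power-distribution losses. All numbers here are hypothetical.

HOURS_PER_YEAR = 8760

def facility_kwh(server_kw: float, storage_kw: float,
                 network_kw: float, pue: float) -> float:
    """Annual facility electricity (kWh) from average IT loads and PUE."""
    it_load_kw = server_kw + storage_kw + network_kw
    return it_load_kw * pue * HOURS_PER_YEAR

# Hypothetical mid-size facility in which servers dominate the IT load.
total = facility_kwh(server_kw=500, storage_kw=100, network_kw=50, pue=1.9)
print(f"{total:,.0f} kWh/yr")  # ~10.8 million kWh/yr
```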

One wild card for which Koomey was unable to account is the increasing prevalence of "self-assembled" servers used in data centers at large organizations such as Google. "The company assembles its own custom servers -- and has been doing so for a long time," Koomey noted.

His educated guess is that in 2010 Google's servers accounted for about 0.01 percent of worldwide electricity use and less than 1 percent of global data-center electricity use.
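Those two figures are consistent with the report's headline numbers: if data centers drew about 1.3 percent of the world's electricity in 2010, then just under 1 percent of that share works out to roughly 0.01 percent of the world total. A quick sanity check, using only the article's figures:

```python
# Sanity check on the Google estimate using the article's own shares:
# data centers at 1.3% of world electricity, Google's servers at just
# under 1% of data-center electricity.

datacenter_share_of_world = 0.013
google_share_of_datacenters = 0.01  # "less than 1 percent" upper bound

google_share_of_world = datacenter_share_of_world * google_share_of_datacenters
print(f"Implied share of world electricity: {google_share_of_world:.3%}")
# -> 0.013%, the same order of magnitude as the ~0.01% guess
```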

"This result is in part a function of the higher infrastructure efficiency of Google's facilities compared to in-house data centers, which is consistent with efficiencies of other cloud-computing installations," Koomey explained. "But it also reflects lower electricity use per server for Google's highly optimized servers."

Slower Growth Ahead

Koomey noted that predicting future electricity use by data centers is a complicated task because server designs change over time. Though today's servers "have much higher processing power, more memory, faster network connections, more components, and bigger power supplies, [they also] have power management and other clever technologies to reduce electricity consumption," Koomey explained.

Peak power per server -- which seems to be increasing -- may also be diverging from annual electricity use per server, which is growing more slowly and may actually be declining. Moreover, the IDC server forecast on which Koomey's study is based shows virtually no growth in the installed server base from 2010 to 2013, which, if correct, would mean slower growth in data-center electricity use.
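To see how peak power can rise while annual electricity use falls, note that annual energy is roughly peak power times a load factor (average draw as a fraction of peak) times the hours in a year. The sketch below uses entirely hypothetical numbers to illustrate the mechanism, not figures from the study:

```python
# Illustrative only: peak power per server can rise even as annual
# electricity use per server falls, if power management lowers the
# average draw relative to peak. All numbers are hypothetical.

HOURS_PER_YEAR = 8760

def annual_kwh(peak_watts: float, load_factor: float) -> float:
    """Annual energy (kWh) from nameplate peak and average-to-peak ratio."""
    return peak_watts * load_factor * HOURS_PER_YEAR / 1000

# Older server: lower peak, but idles at a large fraction of peak power.
older = annual_kwh(peak_watts=300, load_factor=0.70)  # ~1,840 kWh/yr

# Newer server: higher peak, but power management cuts the average draw.
newer = annual_kwh(peak_watts=400, load_factor=0.45)  # ~1,577 kWh/yr

print(f"older: {older:,.0f} kWh/yr, newer: {newer:,.0f} kWh/yr")
```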

The ongoing migration to cloud-computing environments -- which typically achieve higher server utilization levels and infrastructure efficiencies than in-house data centers -- should also help lower electricity use, Koomey predicted. Meanwhile, one way large organizations can reduce their data-center electricity bills is to decommission servers that are no longer relevant to their IT operations.

"Anecdotal evidence indicates that 10 percent to 30 percent of servers in many data centers are using electricity but no longer delivering computing services," Koomey observed. "In many facilities, nobody even knows these servers exist."