Pillar to Post: Peter Welander's Blog

News and comment from Control Engineering process industries editor, Peter Welander


The cloud’s black lining

Server farms and data centers exposed as terrible energy wasters and polluters. Where is the efficiency strategy?

September 24, 2012

How long would your company survive if your energy usage efficiency was less than 20%? How about 10%? In our high-tech world, that is the reality of most data centers. Your ability to send an email or read this article depends on thousands of data centers around the world, collectively drawing something on the order of 30 billion watts off the grid. The problem is that most of those data centers are terribly inefficient.

This news comes from the New York Times in a series of articles that started yesterday with part one, examining the infrastructure of the Web. (You may get caught in the paywall. It’s worth the 99¢. I’ve been reading it the old-fashioned way.) Here’s the second installment.

One conclusion of the very extensive story is that data centers “can waste 90% or more of the electricity they pull off the grid.” The figure for any given site varies widely, but that is the industry average. The main reason suggested is that the companies maintaining this infrastructure are paranoid about service outages. Consumers get testy if a Web-based service is not available 100% of the time. As a result, these facilities are built with enormous overcapacity and elaborate backup power systems. Those servers run whether they’re doing anything or not, and most of them aren’t doing much. Moreover, the servers generate heat that has to be dissipated. It all adds up.
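The arithmetic behind that 90%-waste claim is easy to sketch. Two factors compound: only a fraction of the facility's power reaches the IT equipment at all (the industry measures this as PUE, total facility power divided by IT power), and the servers that do get power sit mostly idle. The utilization and PUE figures below are illustrative assumptions, not measurements from the article:

```python
# Back-of-envelope: what fraction of grid power does useful computation?
# Two compounding losses: facility overhead (cooling, power conversion,
# captured by PUE) and idle servers (low average utilization).

def effective_efficiency(server_utilization, pue):
    """Fraction of grid power that ends up doing useful work.

    Only 1/pue of facility power reaches the IT equipment; of that,
    only the utilized fraction performs real computation.
    """
    return server_utilization / pue

# A lightly loaded facility: 10% average utilization, PUE of 1.8
# (both are assumed, round numbers for illustration).
print(f"{effective_efficiency(0.10, 1.8):.1%}")
```

With those assumed numbers, barely 6% of the electricity pulled off the grid does useful work, which is right in the range the story describes.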

To make matters worse, there have been reports of gross violations of local permitting and pollution regulations involving diesel generator installations. The story cites six-figure fines Amazon received in Virginia, and it is not alone.

The reason for all this capacity is the huge amount of data we’re creating these days. Estimates in the article suggest the world generated 1.8 trillion GB of digital information last year. That number keeps climbing, and roughly 75% of it apparently comes from consumers. It’s mind-boggling. Must 20 GB of my vacation video occupy space on a server somewhere? (It doesn’t.)

All of this leaves one wondering where the automation companies are; they would find this situation utterly intolerable in any other industrial context. Hasn’t it occurred to some of the smarter people at Google, Facebook, Amazon, and the other Web giants that this is a huge cost? Surely some of our industry’s suppliers could deploy technology capable of modulating server capacity to follow demand, and sensible consumers would realize that half the people in the U.S. can’t all watch a Call Me Maybe video on YouTube at the same time.
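"Modulating server capacity to follow demand" is, at heart, the same load-following control problem the process industries solve every day. A minimal sketch, with entirely hypothetical numbers and names (no vendor's actual product), might look like a threshold autoscaler: keep just enough servers awake for the current load plus a safety margin, and let the rest idle or power down:

```python
# A minimal sketch of demand-following server capacity: size the active
# fleet to the measured load plus headroom, instead of running every
# machine flat out around the clock. All figures are hypothetical.

import math

def servers_needed(current_load, capacity_per_server, headroom=0.25):
    """Active servers required for the load plus a 25% safety margin."""
    target = current_load * (1 + headroom)
    return max(1, math.ceil(target / capacity_per_server))

# Demand over a day (requests/sec) against a fixed fleet of 100 servers,
# each assumed to handle 50 req/s:
for load in (500, 2000, 4000, 1000):
    n = servers_needed(load, capacity_per_server=50)
    print(f"load {load:>5} req/s -> {n:>3} servers active (of 100)")
```

Even this crude scheme would leave most of the fleet powered down overnight; the outage paranoia the story describes is exactly why operators keep the headroom at 100% instead of 25%.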

You can follow the additions to the story this week at www.nytimes.com/national.

Peter Welander, pwelander(at)cfemedia.com