Greening the Data Center Conflicts with Outsourced Cloud Computing Suppliers

GigaOm has a post on green computing in the data center.

Green Computing Needs a Data Center Whisperer

By Stacey Higginbotham | Friday, November 20, 2009 | 5:05 PM PT

As compute demand increases, demand for power in data centers is soaring. To help IT professionals halt the spread of watt-consuming servers, the industry needs to develop software that can communicate the ways in which the various layers of the data center perform and interact. They need a binary version of Cesar Millan — a data center whisperer.

Speaking at a panel held Wednesday night in Austin, Texas, several folks from the large server shops and a distinguished engineer who runs a data center for IBM spoke about the challenges of keeping power consumption down in a world where computing demand is going up. (For a truly in-depth look at this topic, check out our GigaOM Pro report — subscription required.) The panel went beyond just power and cooling (thank goodness) to focus on how companies are increasingly viewing power consumption in the data center as a whole, rather than merely as the sum of the data center’s processors.

The IBM engineer achieved these results by creating an internal cloud computing initiative.

IBM’s Scott Winters said he saved 30 percent on his energy costs over three years while increasing his computing abilities by 50 percent and his storage by 150 percent. He did this in two primary ways: by virtualizing his data center and creating a pool of shared resources that are used on demand, and by paying attention to software he has running that tells him what’s happening on his servers.
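To put those numbers in perspective, here is a quick back-of-the-envelope calculation of what they imply for energy cost per unit of compute. The baseline dollar figure and compute index are invented for illustration; only the 30 percent and 50 percent figures come from the article.

```python
# Back-of-the-envelope: what the reported savings imply for energy cost per unit of compute.
# The baseline spend and compute index are made up; only the percentages come from the article.

baseline_energy_cost = 1_000_000  # hypothetical annual energy spend ($)
baseline_compute = 100            # arbitrary index of computing capability

new_energy_cost = baseline_energy_cost * (1 - 0.30)  # 30% energy savings over three years
new_compute = baseline_compute * (1 + 0.50)          # 50% more computing capability

before = baseline_energy_cost / baseline_compute
after = new_energy_cost / new_compute

print(f"Energy cost per compute unit before: ${before:,.0f}")
print(f"Energy cost per compute unit after:  ${after:,.0f}")
print(f"Reduction per unit of work: {1 - after / before:.0%}")  # roughly 53% with these inputs
```

In other words, the headline 30 percent savings understates the efficiency gain, because the energy spend per unit of work drops by more than half when capacity grows at the same time.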

An interesting way to describe this awareness of what is going on is the metaphor of the data center whispering secrets.

“My data center was whispering secrets, and now I have a way to understand them,” Winters said. He said his IBM software and linking that software to the physical infrastructure helped him reach such an understanding, especially in regard to managing power consumption. It’s a strategy that HP has embraced with its products; there are also several startups pushing data center sensor networks that allow the data center’s server hardware and its physical infrastructure like the chillers and air conditioners to communicate.

The IBM expert does a good job of explaining to the writer what comes next.

But as the facilities and IT infrastructure merge (the jobs of the facilities manager and the IT manager are also on a path to merge, according to members of the panel), standards are needed. The folks building the physical infrastructure typically use proprietary software in their products and sensors, and getting that sensor network to talk to your servers can require a big programming effort. Once folks can manage their physical infrastructure and their hardware, the next step is to tie the physical and hardware layers to the application layer. That’s a big dream, and we’re still far off. But given the demand for computing and constraints on providing the power to meet that demand, it’s an issue that panels like the one Wednesday night will help solve.
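To give a flavor of the "big programming effort" the panel describes, here is a minimal sketch of the kind of glue code needed to correlate facilities sensors with IT load. Every function here is a hypothetical stand-in; real chiller, UPS, and server vendors each expose their own proprietary interfaces, which is exactly the standards gap the panel called out.

```python
# Illustrative sketch only: correlating facilities sensors with IT load to estimate PUE.
# read_chiller_power(), read_ups_power(), and read_server_power() are hypothetical
# placeholders for whatever proprietary vendor interfaces a real data center must integrate.

def read_chiller_power() -> float:
    """Placeholder for a proprietary cooling-plant sensor feed (kW)."""
    return 180.0

def read_ups_power() -> float:
    """Placeholder for power-distribution losses and UPS overhead (kW)."""
    return 60.0

def read_server_power() -> float:
    """Placeholder for aggregate IT load reported by the servers themselves (kW)."""
    return 400.0

def estimate_pue() -> float:
    """Power Usage Effectiveness: total facility power divided by IT power."""
    it_load = read_server_power()
    total_facility = it_load + read_chiller_power() + read_ups_power()
    return total_facility / it_load

if __name__ == "__main__":
    print(f"Estimated PUE: {estimate_pue():.2f}")  # 1.60 with the placeholder numbers above
```

Writing and maintaining adapters like these for each vendor's sensors is the programming burden that common standards would remove.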

Now, here is where the conflict occurs between an outsourced cloud computing company and its customer. Whoever the outsourcing company is (Google, Amazon, Microsoft, or IBM), its goal is to minimize its costs while maximizing revenue. A cloud computing company is happy to host services that are inefficient, because it makes more money when the customer runs services requiring multiple server instances with more resources. Its margins don’t improve when the customer runs a highly utilized load.

So, in the zest for cloud computing, customers are signing up with suppliers who want them to be inefficient. Those suppliers will sell you their cloud computing infrastructure as lower cost than your own infrastructure, but is it as low as it could be?
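As a rough illustration of that incentive (the instance price and counts below are invented, not taken from any real provider's price list), compare what the same workload costs when it is left inefficient versus when it is consolidated onto well-utilized instances.

```python
# Invented numbers purely to illustrate the incentive described above: the supplier
# bills per instance-hour, so an inefficient deployment of the same workload
# produces a larger bill, which is more revenue for the supplier.

price_per_instance_hour = 0.40   # hypothetical hourly rate ($)
hours_per_month = 730

# Same workload, two deployments:
inefficient_instances = 10       # low utilization, lots of idle headroom
efficient_instances = 4          # consolidated onto well-utilized instances

inefficient_bill = inefficient_instances * price_per_instance_hour * hours_per_month
efficient_bill = efficient_instances * price_per_instance_hour * hours_per_month

print(f"Inefficient deployment: ${inefficient_bill:,.0f}/month")
print(f"Efficient deployment:   ${efficient_bill:,.0f}/month")
print(f"Extra revenue to the supplier: ${inefficient_bill - efficient_bill:,.0f}/month")
```

The supplier collects the difference either way; only the customer has a direct financial reason to push utilization up.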

The supplier is not going to do what the referenced post describes as the next step:

Once folks can manage their physical infrastructure and their hardware, the next step is to tie the physical and hardware layers to the application layer. That’s a big dream, and we’re still far off.

Cloud computing has its advantages, but the dominant players are going to run their own cloud computing infrastructure and tune the facilities, IT hardware, and applications to be efficient, making themselves the low-cost information suppliers.

If your goal is to be the lowest-cost provider of IT services, think carefully about your cloud computing, whether you own it or rent it.