MIT’s Technology Review has an article on an Internet-routing algorithm that adapts to energy prices.
Energy-Aware Internet Routing
Software that tracks electricity prices could slash energy costs for big online businesses.
By Will Knight
MONDAY, AUGUST 17, 2009
An Internet-routing algorithm that tracks electricity price fluctuations could save data-hungry companies such as Google, Microsoft, and Amazon millions of dollars each year in electricity costs. A study from researchers at MIT, Carnegie Mellon University, and the networking company Akamai suggests that such Internet businesses could reduce their energy costs by as much as 40 percent by rerouting data to locations where electricity prices are lowest on a particular day.
Data beast: Google maintains a huge datacenter in The Dalles, OR. Credit: John Nelson

Modern datacenters gobble up huge amounts of electricity, and usage is increasing at a rapid pace. Energy consumption has accelerated as applications move from desktop computers to the Internet and as information gets transferred from ordinary computers to distributed "cloud" computing services. For the world's biggest information-technology firms, this means spending upwards of $30 million on electricity every year, by modest estimates.
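To make the idea concrete, here is a rough sketch of what a price-aware routing decision could look like. This is my own illustration, not the researchers' actual algorithm; the site names, prices, and latency budget are made up.

```python
# Hedged sketch (not the paper's algorithm): among replica locations that
# still meet a client latency budget, send new requests to the one with
# the lowest current electricity price.

from dataclasses import dataclass


@dataclass
class Site:
    name: str
    latency_ms: float       # measured latency from the client region
    price_usd_mwh: float    # current wholesale electricity price


def choose_site(sites, max_latency_ms=100.0):
    """Return the eligible site with the cheapest electricity right now."""
    eligible = [s for s in sites if s.latency_ms <= max_latency_ms]
    if not eligible:
        # Fall back to the lowest-latency site if none meets the budget.
        return min(sites, key=lambda s: s.latency_ms)
    return min(eligible, key=lambda s: s.price_usd_mwh)


if __name__ == "__main__":
    sites = [
        Site("The Dalles, OR", latency_ms=42.0, price_usd_mwh=31.5),
        Site("Ashburn, VA", latency_ms=18.0, price_usd_mwh=55.0),
        Site("Chicago, IL", latency_ms=27.0, price_usd_mwh=24.0),
    ]
    best = choose_site(sites)
    print(f"Route new requests to {best.name} (${best.price_usd_mwh}/MWh)")
```

A real system would re-run a decision like this as prices and load change, and would have to weigh bandwidth costs and user latency against the price gap.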
The researchers worked with Akamai to test their ideas.
Asfandyar Qureshi, a PhD student at MIT, first outlined the idea of a smart routing algorithm that would track electricity prices to reduce costs in a paper presented in October 2008. This year, Qureshi and colleagues approached researchers at Akamai to obtain the real-world routing data needed to test the idea. Akamai's distributed servers cache information on behalf of many large Web sites across the US and abroad, and process some 275 billion requests per day. While the company does not require many large datacenters itself, its traffic data provides a way to model the demand placed on large Internet companies.
The researchers first analyzed 39 months of electricity price data collected for 29 major US cities. Energy prices fluctuate for a variety of reasons, including seasonal changes in supply, fuel price hikes, and shifts in consumer demand. The researchers saw a surprising amount of volatility, even among geographically close locations.
The interesting insight is that no single site was always the cheapest.
"The thing that surprised me most was that there was no one place that was always cheapest," says Bruce Maggs, vice president of research at Akamai, who contributed to the project while working as a professor at Carnegie Mellon and is currently a professor at Duke University. "There are large fluctuations on a short timescale."
Keep in mind this is cost reduction, not energy reduction.
Maggs cautions that the idea is not guaranteed to reduce energy usage or pollution, only energy costs. "The paper is not about saving energy but about saving cost, although there are some ways to do both," he says. "You have to hope that those are aligned."
And they reached out to Digital Realty Trust’s Mike Manos to get his view.
Michael Manos, senior vice president of Digital Realty Trust, a company that designs, builds, and manages large datacenters, believes that the lack of elasticity in modern hardware makes it impossible to achieve the suggested improvements.
"It is great research but there are some base fundamental problems with the initial assumptions, which would prevent the type of savings they present," Manos says. Because most servers aren't used to capacity, he says, "you just can't get there."
However, Manos does see plenty of room for improvement in datacenter designs. "I believe the datacenter industry is just beginning to enter into a Renaissance of sorts," he says. "Technology, economic factors, and a new breed of datacenter managers are forcing change into the industry. It's a great time to be involved."