Google shares its data center cooling best practices - water and hot-aisle containment "hot huts"

Google has published a reader-friendly explanation of its data center cooling.

Our emphasis on cooling systems might come as a surprise, until you consider how warm a personal computer can become during use. Data centers, which house thousands of computers, need to stay within a specific operating temperature range. Even though we run our facilities hotter than a typical data center, we need cooling systems - both to prevent server breakdowns and to provide a reasonable working environment for technicians working on the data center floor.

After servers, the cooling system is the second largest consumer of power in a data center. We needed a cooling system that minimized our overall energy consumption. For this reason, we designed our own cooling systems from the ground up.
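To put that power split in perspective, here is a minimal sketch of the industry-standard PUE (Power Usage Effectiveness) metric, in which cooling is typically the largest non-IT term. All of the numbers are illustrative assumptions, not figures from Google.

```python
# Illustrative PUE (Power Usage Effectiveness) calculation.
# All numbers below are assumptions for demonstration, not Google's figures.

it_load_kw = 10_000        # power drawn by servers, storage, and network gear
cooling_kw = 1_200         # power drawn by chillers, fans, and pumps
other_overhead_kw = 300    # lighting, power distribution losses, etc.

total_facility_kw = it_load_kw + cooling_kw + other_overhead_kw
pue = total_facility_kw / it_load_kw

print(f"PUE = {pue:.2f}")  # 1.15 here; an ideal facility approaches 1.0
```

The closer the cooling term shrinks toward zero, the closer PUE gets to 1.0, which is why cooling is the natural place to attack overall energy consumption.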

The interior of a hot hut row

Google uses hot-aisle containment ("hot huts"), which creates a higher delta-T across the water cooling coils at the top of the huts.
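The physics behind this is the sensible-heat relation Q = m_dot * c_p * delta-T: the heat an airstream carries scales linearly with delta-T, so containment that widens the temperature difference cuts the airflow (and fan energy) needed for the same load. Here is a minimal Python sketch under assumed values; the rack load and delta-T figures are hypothetical, not Google's.

```python
# Heat carried by an airstream: Q = rho * V_dot * c_p * delta_T.
# Rearranged, the airflow needed for a given heat load falls linearly
# as delta-T rises. All numbers are illustrative assumptions.

RHO_AIR = 1.2    # kg/m^3, air density near sea level
CP_AIR = 1005.0  # J/(kg*K), specific heat of air

def airflow_needed(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove heat_load_w at delta_t_k."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

rack_load_w = 20_000  # assumed 20 kW rack

for dt in (10, 20):   # e.g. open aisle vs. contained hot aisle
    print(f"delta-T {dt:2d} K -> {airflow_needed(rack_load_w, dt):.2f} m^3/s")
# delta-T 10 K -> 1.66 m^3/s
# delta-T 20 K -> 0.83 m^3/s (half the airflow for the same heat load)
```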

IBM has used water cooling in its supercomputers for years and has even put the waste heat to work heating buildings.

SuperMUC combines its hot-water cooling capability, which removes heat 4,000 times more efficiently than air, with 18,000 energy-efficient Intel Xeon processors. In addition to aiding scientific discovery, the integration of hot-water cooling and IBM's application-oriented, dynamic systems management software allows energy to be captured and reused to heat buildings during the winter on the sprawling Leibniz Supercomputing Centre campus, for savings of one million Euros ($1.25 million USD) per year.
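As a rough illustration of where a figure like that could come from, here is a back-of-the-envelope sketch of waste-heat reuse savings; every input is an assumption chosen only to show the shape of the arithmetic, not data from IBM or the Leibniz Supercomputing Centre.

```python
# Back-of-the-envelope waste-heat reuse savings. Every figure here is an
# assumption, used only to show how a seven-figure annual saving can arise.

it_load_kw = 3_000             # assumed average IT load of the machine
reuse_fraction = 0.90          # hot-water cooling captures most heat in the water loop
heating_hours = 4_000          # hours/year the heat displaces other heating
heat_price_eur_per_kwh = 0.09  # assumed price of the displaced heating energy

recovered_kwh = it_load_kw * reuse_fraction * heating_hours
savings_eur = recovered_kwh * heat_price_eur_per_kwh

print(f"Recovered heat: {recovered_kwh:,.0f} kWh/year")
print(f"Savings: ~{savings_eur:,.0f} EUR/year")  # ~972,000 EUR with these inputs
```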

Now for those who think Google should use its waste heat to heat homes, the problem is that Google's data centers are not close to residential areas or commercial buildings that could use the low-grade heat.

In some data centers there is a hard-and-fast rule to keep water off the data center floor, but if you want to be the most efficient, you need to break some rules.