Evil Side of Benchmarks and Metrics – Why laptop batteries don’t last as long as claimed

Newsweek has an article about why laptop batteries don’t last as long as manufacturers’ claims. The article starts with the problem of who defines the metrics for laptop batteries.

Daniel Lyons

Hurry Up and Type

Why your laptop runs out of juice so fast.

Published Jun 18, 2009

From the magazine issue dated Jun 29, 2009

Imagine if automakers got together and started measuring the gas mileage of new cars with a cool test of their own making—one in which the cars were rolling downhill with their engines idling. Suddenly you'd have some pretty amazing claims: Why, that three-ton SUV gets 300 miles per gallon! This subcompact gets 500! In tiny print at the bottom of the window sticker you'd find a disclaimer saying that, well, um, you know, your mileage may vary.

Crazy, right? Yet that's more or less what's happening with laptop computers and their battery lives. Right now, I'm looking at a Best Buy flier touting a $599 Dell laptop that gets "up to 5 hours and 40 minutes of battery life." Down in the fine print comes a disclaimer explaining that "battery life will vary" based on a bunch of factors. Translation: you ain't gonna get five hours and 40 minutes, bub. Not ever. Not even close.

The specific benchmark referenced is MobileMark 2007.

So how can Dell and Best Buy make that claim? These battery-life numbers are based on a benchmark test called MobileMark 2007 (MM07). The test was created by a consortium called BAPCo (Business Application Performance Corp.), whose members are—you guessed it—computer makers and other tech companies. AMD, the No. 2 maker of microprocessors, is a member of BAPCo, but now has become a whistle-blower. AMD says PC makers know full well that the new tests produce misleading numbers, but they are touting them anyway.

Any experienced technology person has learned to ignore most benchmarks: they are easy to game, and there is no requirement for vendors to be transparent about how they achieve their results.
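To see how much the choice of workload drives the headline number, here is a minimal sketch using made-up figures (a hypothetical 56 Wh battery and assumed power draws). It is not MM07's methodology or any vendor's data, just arithmetic showing why a near-idle test can print "5+ hours" while realistic use gives far less.

# Illustrative only: hypothetical battery capacity and power draws,
# not MM07 methodology or any vendor's measurements.
# Battery life is roughly capacity (watt-hours) / average draw (watts),
# so the workload chosen during the test dominates the headline number.

BATTERY_WH = 56.0  # hypothetical 56 Wh laptop battery

workloads_watts = {
    "benchmark-style (screen dimmed, mostly idle)": 10.0,
    "web browsing over Wi-Fi": 18.0,
    "video playback": 22.0,
    "compiling code": 35.0,
}

for workload, watts in workloads_watts.items():
    hours = BATTERY_WH / watts
    print(f"{workload:45s} ~{hours:.1f} hours")

With these assumed numbers the near-idle case works out to roughly five and a half hours, while the realistic workloads land between one and a half and three hours from the very same battery.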

So what about energy-efficiency and power-saving claims from server and data center vendors? Many are taking advantage of the lack of benchmarks for power-saving claims. Who has the capability to measure results across the range of vendors?

And even if there is a benchmark, how do users know how the tests were performed? In this economy, it is easy for executives to make the decision and take the risk of overstating efficiency while customers look for ways to save money.

What do you do? Ask for more information on how the performance tests were run. Can you get a copy of the test results? What is the range of expected results? What factors most influence the results? How closely do the test conditions match your own?
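One of those questions, the range of expected results, is easy to quantify once you have even a few runs under your own conditions. Below is a minimal sketch with hypothetical numbers (an assumed "up to 5 hours and 40 minutes" claim and invented run times), just to show how far a headline figure can sit from measured reality.

# Sanity-check sketch: compare a vendor's "up to" headline with your own runs.
# All numbers here are hypothetical, for illustration only.

from statistics import mean

claimed_hours = 5.67                         # e.g. "up to 5 hours and 40 minutes"
measured_hours = [3.1, 2.8, 3.4, 2.9, 3.2]   # runtimes measured under your workload

low, high = min(measured_hours), max(measured_hours)
average = mean(measured_hours)

print(f"Claimed:  {claimed_hours:.2f} h")
print(f"Measured: {low:.2f}-{high:.2f} h (mean {average:.2f} h)")
print(f"Claim is {claimed_hours / high:.1f}x the best measured run")

Even a handful of runs like this gives you a range to hold up against the marketing number, and a basis for asking the vendor to explain the gap.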

Benchmarks and metrics are good, but without an auditing infrastructure they are too easily used to overstate claims. Watch out for metrics without third-party validation.