Tech manufacturers continue misleading consumers with impressive-sounding but less useful specs like milliamp-hours and megahertz, while hiding the one measurement that matters most: watts. The Verge argues that the watt provides the clearest picture of a device’s true capabilities by showing how much power courses through chips and how quickly batteries drain. With elementary math, consumers could easily calculate battery life by dividing watt-hours by power consumption. The Verge:
The Steam Deck gaming handheld is my go-to example of how handy watts can be. With a 15-watt maximum processor wattage and up to 9 watts of overhead for other components, a strenuous game drains its 49Wh battery in roughly two hours flat. My eight-year-old can do that math: 15 plus 9 is 24, and 24 times 2 is 48. You can fit two hour-long 24-watt sessions into 48Wh, and because you have 49Wh, you’re almost sure to get it.
With the least strenuous games, I’ll sometimes see my Steam Deck draining the battery at a speed of just 6 watts – which means I can get eight hours of gameplay because 6 watts times 8 hours is 48Wh, with 1Wh remaining in the 49Wh battery.
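The quoted arithmetic is easy to sanity-check: runtime in hours is just watt-hours divided by watts. A quick sketch using the figures from the quote:

```python
battery_wh = 49          # Steam Deck battery capacity
heavy_draw_w = 15 + 9    # processor at full tilt plus component overhead
light_draw_w = 6         # observed draw in an undemanding game

# runtime in hours = watt-hours / watts
print(round(battery_wh / heavy_draw_w, 2))  # 2.04 -- "roughly two hours flat"
print(round(battery_wh / light_draw_w, 2))  # 8.17 -- "eight hours of gameplay"
```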
Unlike megahertz, wattage also indicates sustained performance capability, revealing whether a processor can maintain high speeds or will throttle under thermal constraints. The watt is also already familiar to consumers from light bulbs and power bills, yet manufacturers persist with less transparent metrics that make direct comparisons difficult.
Powerbanks are where it's most problematic. They usually report the capacity of the battery cells in mAh. Those cells sit at 2.8-4.2V during operation, but the powerbank outputs 5V (or, on modern powerbanks, something higher). 5000 mAh at the cells' 3.6V average discharge voltage is certainly not 5000 mAh at the 9V it's giving to my phone.
Nor is it going to give my phone the 2000 mAh @ 9V (18 Wh) that the math would suggest, because the conversion is well below 100% efficient. I'm not sure what's reasonable to demand in advertising here, since efficiency varies with output voltage and output wattage.
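To put rough numbers on that conversion, here's a minimal sketch; the 85% efficiency figure is my own assumption for illustration, since real converters vary with voltage and load:

```python
cell_mah, cell_v = 5000, 3.6   # advertised capacity, at average cell voltage
out_v = 9.0                    # USB-PD voltage delivered to the phone
efficiency = 0.85              # assumed converter efficiency (illustrative only)

stored_wh = cell_mah / 1000 * cell_v        # 18.0 Wh actually in the cells
ideal_mah_out = stored_wh / out_v * 1000    # 2000 mAh at 9 V if conversion were lossless
real_wh = stored_wh * efficiency            # ~15.3 Wh after conversion losses
real_mah_out = real_wh / out_v * 1000       # ~1700 mAh the phone might actually see
print(stored_wh, ideal_mah_out, round(real_mah_out))
```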
Ideally they would provide a graph showing how many watt hours it will output from minimum to maximum load. If it supports multiple voltages, there should be a separate line in the graph for each voltage.
Yeah, and my newer powerbanks all do USB Power Delivery at 5, 9, 12, and 20 V.
I’m assuming watt-hours would be universal for them all (watts are watts, as the saying goes).
Well… sort of.
Batteries perform differently under load. A battery that delivers 10Wh under a 1W load will probably deliver less (and get warmer) under a 10W load. Power supplies also perform differently under load, and DC-DC switching supplies perform differently depending on output voltage. Generally, a larger voltage conversion and/or a higher load is less efficient. There's also some base power consumption in the circuit, so peak efficiency (and thus the most total energy out) is probably reached at some sort of medium load.
To make things more fun, batteries are usually tested under constant current, not constant power. Under a constant-power load, the current rises as the battery voltage drops, so resistive losses climb and you get less total energy out; likewise, constant output power from the converter means rising input power drawn from the cells as they drain.
In short, the real world is complicated. Publishing best- and worst-case watt-hour figures could be a reasonable approach.
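A toy model of that constant-current vs constant-power point, with made-up cell parameters (linear open-circuit-voltage curve, fixed internal resistance; both are simplifications): the constant-power load draws more current as the cell voltage sags, so I²R losses climb and slightly less total energy comes out.

```python
import math

CAP_AH = 3.0   # assumed cell capacity, Ah
R_INT = 0.1    # assumed internal resistance, ohms
V_CUT = 3.0    # terminal-voltage cutoff

def ocv(soc):
    # crude linear open-circuit voltage: 4.2 V full, 3.0 V empty
    return 3.0 + 1.2 * soc

def discharge(current_at):
    """Integrate delivered energy until the terminal voltage hits V_CUT.
    current_at(v_oc) gives the load current at open-circuit voltage v_oc."""
    soc, energy_wh = 1.0, 0.0
    dq = CAP_AH / 20000  # small charge step, Ah
    while soc > 0:
        i = current_at(ocv(soc))
        v_term = ocv(soc) - i * R_INT
        if v_term <= V_CUT:
            break
        energy_wh += v_term * dq  # Wh = V * Ah
        soc -= dq / CAP_AH
    return energy_wh

P = 3.6  # watts, chosen to roughly match 1 A at nominal voltage
def cp_current(v_oc):
    # solve (v_oc - i * R_INT) * i == P, taking the smaller root
    return (v_oc - math.sqrt(v_oc**2 - 4 * R_INT * P)) / (2 * R_INT)

e_cc = discharge(lambda v_oc: 1.0)  # constant 1 A
e_cp = discharge(cp_current)        # constant 3.6 W
print(f"constant current: {e_cc:.2f} Wh, constant power: {e_cp:.2f} Wh")
```

With these parameters the constant-power run comes out a couple of percent behind the constant-current run; real chemistry-dependent effects can make the gap bigger.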
I don't understand why everything isn't just rated in Wh or mWh. mWh even gives them a bigger number to advertise, and it's voltage-independent. Sure, there are load-dependent conversion efficiencies that complicate things a bit, but nobody is going to get up in arms about a 5% deviation from the advertised spec due to less-than-ideal conversion efficiency. Compared to trying to figure out how many recharge cycles I'll get on my 5000mAh laptop battery from my 20000mAh power bank (what voltage is that laptop battery running at again?), a 5% efficiency drop is a big nothingburger.
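Putting hypothetical numbers on that recharge-cycle comparison (the laptop pack voltage and the efficiency are my own guesses, just to show the shape of the math):

```python
bank_mah, bank_cell_v = 20000, 3.6   # power bank rated at its cell voltage
laptop_mah, laptop_v = 5000, 11.55   # say, a 3-cell pack; voltage assumed
efficiency = 0.85                    # assumed end-to-end conversion efficiency

naive_charges = bank_mah / laptop_mah   # 4.0 -- what comparing raw mAh implies
real_charges = (bank_mah / 1000 * bank_cell_v * efficiency) / (laptop_mah / 1000 * laptop_v)
print(naive_charges, round(real_charges, 2))  # 4.0 vs ~1.06 full charges
```

Comparing the two in watt-hours (72 Wh stored vs a 57.75 Wh pack) makes the answer obvious at a glance; comparing in mAh at different voltages makes it look like four charges when it's barely one.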