Friday 16 November 2012

Non-linear characteristics in a draining battery on the Nexus 7

Measuring power consumption on low-power devices really is not as simple as running tools such as PowerTop and assuming the data is trustworthy.  I shall explain why.

With Ubuntu on the Nexus 7, the battery driver originally provided battery capacity only as a percentage, which lacked the precision needed to make any sane power consumption estimates.   We tweaked the battery driver so that we could read the battery capacity in uWh from the bq27541 battery fuel gauge.  From this, one can measure the change in capacity over time and hence estimate the power consumed by the device.
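
As an illustration of the idea, here is a minimal sketch that reads the capacity twice and converts the drop into Watts. The sysfs path used here is a hypothetical example; the actual attribute name depends on the driver tweak. Since 1 uWh is 3.6 millijoules, the average power in Watts is the drop in uWh multiplied by 3.6e-3 and divided by the interval in seconds.

    /* drain.c: estimate power drawn from two fuel gauge capacity readings.
     * Build with: cc -o drain drain.c
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    /* hypothetical sysfs attribute exposing battery capacity in uWh */
    #define ENERGY_PATH "/sys/class/power_supply/battery/energy_now"

    static long long read_uWh(void)
    {
        FILE *fp = fopen(ENERGY_PATH, "r");
        long long uWh = -1LL;

        if (fp) {
            if (fscanf(fp, "%lld", &uWh) != 1)
                uWh = -1LL;
            fclose(fp);
        }
        return uWh;
    }

    int main(void)
    {
        const int secs = 10;    /* interval between the two samples */
        long long e0, e1;

        if ((e0 = read_uWh()) < 0) {
            fprintf(stderr, "cannot read %s\n", ENERGY_PATH);
            exit(EXIT_FAILURE);
        }
        sleep(secs);
        if ((e1 = read_uWh()) < 0)
            exit(EXIT_FAILURE);

        /* 1 uWh = 3.6e-3 Joules, so Watts = delta uWh * 3.6e-3 / secs */
        printf("%.3f Watts\n", (double)(e0 - e1) * 3.6e-3 / secs);
        return 0;
    }

Naturally this only makes sense when the device is running from its battery; on mains power the capacity barely changes over such a short interval.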

For the Ubuntu 12.04 Precise release I wrote the lightweight power measurement tool "powerstat" to estimate power consumption from the change in battery capacity.  Powerstat gathers battery capacity readings and, using a simple sliding window over the samples, estimates the power being consumed over time.  On laptops, which consume a lot of power, this provides a reasonable rough estimate of power consumption.
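
The sliding window itself is nothing exotic: keep the last N per-interval power estimates in a circular buffer and report their mean. A minimal sketch of the idea follows; powerstat's actual implementation differs, and the readings fed in here are synthetic.

    #include <stdio.h>

    #define WINDOW  10      /* number of samples in the sliding window */

    static double window_buf[WINDOW];
    static int count;

    /* add a reading, return the mean of the most recent WINDOW readings */
    static double sliding_mean(double watts)
    {
        double sum = 0.0;
        int i, n;

        window_buf[count++ % WINDOW] = watts;
        n = (count < WINDOW) ? count : WINDOW;
        for (i = 0; i < n; i++)
            sum += window_buf[i];
        return sum / (double)n;
    }

    int main(void)
    {
        /* synthetic noisy readings, purely for illustration */
        double w[] = { 4.1, 3.8, 4.3, 3.9, 4.2, 3.7, 4.0, 3.9 };
        int i;

        for (i = 0; i < (int)(sizeof(w) / sizeof(w[0])); i++)
            printf("reading %.2fW, smoothed %.2fW\n",
                w[i], sliding_mean(w[i]));
        return 0;
    }

A larger window smooths out more of the sampling noise but responds more slowly to genuine changes in load, so the window size is a trade-off.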

This driver tweak allows powerstat to work on the Nexus 7 running Ubuntu.  So how trustworthy is the data from the battery fuel gauge?  Is the data reliable if we repeat the test under the same conditions?  Do we get consistent readings over time?

For my first set of tests, I fully charged the Nexus 7, fully loaded all four CPUs with busy loops and then ran multiple powerstat tests; powerstat gathers samples over 8 minutes and estimates the power consumption. It also calculates the standard deviation of these samples to give us some idea of the variability of the battery power measurements.    For each powerstat test I logged the battery voltage, the percentage battery capacity (normalized to a range of 0..1 to make it easier to plot) and the estimated power consumption (with its standard deviation), then plotted the results:

With this test the machine is in a steady state: we are not changing the load on the CPUs, so one should expect a steady power measurement.  But as one can see, the battery gauge informs us that the voltage is dropping over time (from ~4V down to ~3.25V), and the estimated power also falls, from 4.6W down to 3.3W.  So, clearly, the power estimate depends on the level of charge in the battery.
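
For reference, the standard deviation quoted by powerstat is just the usual sample statistic over the per-interval power estimates. A minimal sketch of the calculation, not powerstat's actual code, using illustrative readings spanning the observed 4.6W to 3.3W range:

    /* Build with: cc -o stddev stddev.c -lm */
    #include <math.h>
    #include <stdio.h>

    /* sample standard deviation of n power estimates */
    static double stddev(const double *watts, int n)
    {
        double sum = 0.0, var = 0.0, mean;
        int i;

        if (n < 2)
            return 0.0;
        for (i = 0; i < n; i++)
            sum += watts[i];
        mean = sum / n;

        for (i = 0; i < n; i++)
            var += (watts[i] - mean) * (watts[i] - mean);
        return sqrt(var / (n - 1));
    }

    int main(void)
    {
        double w[] = { 4.6, 4.4, 4.1, 3.9, 3.6, 3.3 };

        printf("stddev = %.3fW\n",
            stddev(w, (int)(sizeof(w) / sizeof(w[0]))));
        return 0;
    }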

I also measured an idle machine:

Again, the voltage drops over time and the estimated power drops with it.  More interesting is that the estimated power measurement is not particularly smooth over time, as the plot of the standard deviation shows.   We can therefore conclude that a lightly loaded machine produces a lot of variability in the estimated power consumption data, which means we cannot realistically measure subtle power optimization tweaks made to the software; there is simply too much variability in the data.

I re-ran the idle test over several days, running from the same fully charged state to a completely empty battery, and compared runs.  The duration of the test varied by +/- 5%.  Comparing the estimated power consumption at the 100%, 75%, 50% and 25% battery capacity points also showed a lot of variability.  This means one cannot get accurate and repeatable power estimates even when the measurements are taken at the same battery capacities.

So next time somebody tells you that the latest changes made their low-power device suck more (or less!) power than the previous release, and their findings are based on data derived from the battery fuel gauge, take it with a pinch of salt.

The only reliable way to measure instantaneous power consumption is to use specialised precision equipment that has been accurately calibrated.

Comments:

  1. In active (normal) mode, the bq27541 samples at one-second intervals, while powerstat samples at 10-second intervals. Increasing the sample rate in powerstat should reduce sampling error; in practice, does that provide cleaner results? Similarly, a 10-second sample rate risks aligning the samples with tasks scheduled at round-number intervals such as 60 or 120 seconds. Would an 'odd' sample rate such as 7 or 9 seconds provide cleaner data by reducing the likelihood of samples being taken during the execution of periodic tasks?

  2. Well, that is something that could be investigated, but we are talking about a small amount of variability in the data. Even supposing we could be 100% accurate, the bigger issue is the non-linear discharge characteristic of the battery, which produces different readings depending on how full the battery is, so it really isn't worth pursuing. For the kinds of measurements we need, to see if wake-up reduction or rendering optimisations help, we are looking at minute changes which simply are not accurately measurable from the battery. One argument is that we should charge the battery up, run the software continuously and see how long the battery takes to discharge, which would show us how much more efficient the code has become. However, as I have already stated, the durations of the steady-state runs (idle or all 4 CPUs spinning) are variable, and it can also take 7-12 hours to drain the battery. I am really arguing that if we want accurate power readings we should use well calibrated measuring equipment and forget about using the battery.

  3. Even if the battery gauge were 100% accurate, the readings would only hold for that cycle of that battery. Due to manufacturing variations, each battery will have different characteristics, and each charge cycle will differ due to the ageing effects of the battery. Finally, temperature has a significant effect, so at the very least these tests need to be done in a controlled environment.
