
Best practice for monitoring coin cell battery voltage given ESR

I'm using an nRF51 with the S130 SoftDevice, operated directly from a coin cell battery. I use the ADC with a voltage divider to successfully measure analog voltages. One challenge of the coin cell is that its ESR is significant and usually gets worse over time. This means that during a substantial current pulse (for example from RF activity) the battery voltage droops meaningfully (sometimes ~200 mV). Suppose my system cannot operate below 2.4 V and I want to monitor the battery voltage and [do something] on a low-battery condition. What's the best way to do this given the ESR? For example, a multimeter might read "2.6 V", but under a healthy current pulse the battery might momentarily droop to 2.4 V, which could brown out my system. Is there a way to synchronize ADC sampling with RF activity so I can measure the worst-case battery voltage?

  • Assuming your device is powered straight from a coin cell, have a look at measuring the battery voltage via the bandgap method (this link is for a different microcontroller, but the general principle holds). This will reduce your system's quiescent current.

    You should reduce the spikes (and increase the life of your coin cell) via large decoupling capacitors.

    I haven't used the S130 BLE stack, but I've used the EVENT_TX event in the ANT+ stacks to synchronize tasks when the radio is transmitting.

    I'd imagine you could infer the health / capacity of the coin cell by measuring its voltage and comparing against a known voltage vs capacity curve. This curve could be found experimentally or from the coin cell datasheet.
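    As a rough illustration of the bandgap method, here is the conversion from a raw ADC count back to supply voltage, assuming the common nRF51 configuration of a 10-bit ADC sampling VDD/3 against the internal 1.2 V bandgap reference (check these against your own CONFIG settings; `adc_to_vdd_mv` is just a hypothetical helper name):

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Convert a raw 10-bit ADC reading of VDD/3 (1.2 V bandgap reference)
     * back to the supply voltage in millivolts:
     *   VDD = count / 1023 * 1.2 V * 3                                    */
    static uint32_t adc_to_vdd_mv(uint32_t adc_count)
    {
        return (adc_count * 1200u * 3u) / 1023u;
    }

    int main(void)
    {
        /* 682 counts corresponds to 2400 mV, right at the 2.4 V limit */
        assert(adc_to_vdd_mv(682) == 2400);
        printf("%lu mV\n", (unsigned long)adc_to_vdd_mv(682)); /* 2400 mV */
        return 0;
    }
    ```

    With this scheme no external divider is needed for the supply measurement, so no current is wasted in divider resistors while the device sleeps.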

  • What kind of current pulses are we talking about? Have you tried measuring the current using the Nordic Power Profiling Kit yet?

  • An easy approach would be a simple discharge test on a fully charged coin cell: apply a constant current pulse (e.g. somewhere between 200-300 mA) for a few minutes (say 10 or 20), then stop the pulse. You should get a graph similar to this:

    [Graph: cell voltage vs. time during and after a constant-current discharge pulse]

    This graph was found on Prof. Gregory Plett's homepage in the Chapter 2 PDF file.

    From this graph you can crudely approximate the R_0 value (i.e. the ESR) from Ohm's law: ΔV = R_0 * ΔI. Since you can measure the change in voltage and you know ΔI (the difference between the pulse current and 0 A), you can approximate R_0. Like you said, the ESR increases as the state of health of the battery decreases, so you could also try this on one or more used batteries that still have some charge left.

    You could always do more thorough battery testing to find a better estimate of R_0, but that would take more time and computing resources. I would try this first, and if you don't think it's a good fit we can discuss more advanced methods.
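    The Ohm's-law estimate above can be sketched as a tiny helper. The numbers here are illustrative, not measurements from a real cell, and `estimate_esr_ohms` is a hypothetical function name:

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* Crude ESR estimate from a current-pulse test:
     *   R_0 = (V_rest - V_loaded) / (I_pulse - 0 A)
     * Inputs in millivolts and milliamps, so the ratio comes out in ohms. */
    static double estimate_esr_ohms(double v_rest_mv, double v_loaded_mv,
                                    double i_pulse_ma)
    {
        return (v_rest_mv - v_loaded_mv) / i_pulse_ma;
    }

    int main(void)
    {
        /* Example: cell rests at 2.9 V and sags to 2.6 V under a 15 mA pulse */
        double r0 = estimate_esr_ohms(2900.0, 2600.0, 15.0);
        assert(r0 == 20.0);
        printf("ESR ~ %.1f ohm\n", r0); /* ESR ~ 20.0 ohm */
        return 0;
    }
    ```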

  • @rct42: I currently use the internal bandgap as the reference voltage.

    I have 10 uF bulk capacitance on board and have tried as much as 20 uF with similar results.

    That's interesting -- when you use EVENT_TX, are you guaranteed that the sampling actually takes place while the radio is transmitting, or is it simply synchronized to the transmission?

    Since I'm not trying to do any sort of fuel gauge, I think the easiest thing to do is measure the worst-case battery voltage (i.e. during the droop).

  • @Bjorn -- yes, I've used the PPK. My spikes are on the order of 15 mA, but with an ESR of ~20 Ω on a coin cell this can produce a substantial voltage drop.
