Battery level read via ADC pin on nRF chip shows significant fluctuation

Hi everyone,

We're reading the battery (BAT) level via an ADC pin; the setup diagram is shown below.

We tested our device at a hospital and observed significant fluctuation in its BAT level (highlighted in red). For example, it dropped from 65% to 20%, then jumped back to 60%.

We then took the device and battery back to the office and tried to reproduce the issue on the benchtop. However, we couldn't reproduce it; the following is what we got, and it worked as expected.

Our hypothesis was that it could be due to an impedance mismatch between the ADC input pin and the output of the voltage divider. However, based on our calculation, the output impedance of the voltage divider is R13//R15 (50 kOhm), while the input impedance of the ADC pin is more than 1 MOhm, so the divider should not load the ADC significantly.
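As a sanity check on that reasoning, here is a quick calculation (the individual values R13 = R15 = 100 kOhm are assumptions, since only the 50 kOhm parallel value is given above; the 1 MOhm ADC figure is the quoted lower bound):

```python
# Hypothetical divider values; only R13//R15 = 50 kOhm is stated in the post.
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

R13 = 100e3   # assumed
R15 = 100e3   # assumed
R_ADC = 1e6   # quoted lower bound for the ADC input impedance

r_source = parallel(R13, R15)                   # divider output impedance
loading_error = r_source / (r_source + R_ADC)   # fractional droop from ADC loading

print(f"source impedance: {r_source/1e3:.0f} kOhm")
print(f"worst-case loading error: {loading_error*100:.1f} %")
```

Note that the result is a static gain error of a few percent: it shifts every reading by roughly the same amount, so by itself it cannot explain a reading that jumps between 65% and 20%.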

I'd greatly appreciate any suggestions on this issue. Thanks.

  • Thanks!

    the battery will phantom power the ADC pin before the regulator first starts up and supplies VDD; this can create reset problems

    The following is my power circuit

    Is that really a potential problem here? I'll measure the voltage going into the ADC pin and VDD to see which one comes up first. If the ADC pin comes up first, would a higher resistor value for R13 help mitigate this issue? Or are there other ways to improve it with the current setup?

    Reading your answers in some posts, my understanding is that driving the TX pin (or the ADC pin, in my case) high while the nRF52 is not powered can potentially cause problems with nRF52 reset. However, I'm still unsure of the root cause of the sudden, significant fluctuation in BAT level. Is it because phantom power leads to some unexpected nRF52 behavior?

    You mentioned advertising packets on the nRF52. From my understanding, the nRF52 can draw more current due to connection-establishment effort in a noisy environment, shortening battery life. I haven't noticed any issue with establishing a BLE connection, as none has been reported so far. However, I'll use a BLE sniffer to confirm this at the hospital next time. Please correct me if I'm wrong.

    Another thought on shortened battery life in a noisy environment: to maintain the BLE connection, the device and tablet need to exchange data packets at the defined connection interval. In a noisy environment, there is a high chance these packets need to be retransmitted many times, which leads to high battery drain.

  • The startup time of the TPS61222 DC-DC would be (say) 3 ms and the startup time of the TLV74330 would be (say) 1 ms, so perhaps 4 ms in all. Applying a voltage to a port pin before that 4 ms has elapsed is a design flaw and a potential issue, most likely showing up as incorrect reset behaviour. Will it damage the nRF52? No. Increasing R13 reduces the risk but does not eliminate it; it is preferable simply not to enable the connection to the pin until 3V is stable, perhaps with a FET or NO (Normally Open) analogue switch. A benefit of this approach is that it allows removal of the two current-drain resistors across the battery.

    With a 1.5V battery it is a slightly unusual design choice to boost 1.5V or less to 5V with a DC-DC and then reduce the 5V to 3V with an LDO, as power efficiency is reduced. An alternative is to simply use two DC-DCs and no LDO, preferably with synchronous DC-DC operation. 3V for the nRF52 is also slightly unusual these days unless some attached component insists on such a high voltage; 1.8V would be preferable as long as no LDO is used. If LEDs etc. have to be driven, then use a separate regulator with level shifters just for those components, unless of course battery life is unimportant.
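    The efficiency cost of the boost-then-LDO cascade can be estimated with a quick sketch (the 85% boost efficiency is an assumed typical figure, not taken from the TPS61222 datasheet):

```python
# Rough efficiency of the cascade described above:
# boost 1.5 V -> 5 V (TPS61222), then LDO 5 V -> 3 V (TLV74330).
ETA_BOOST = 0.85                 # assumed typical boost converter efficiency
V_LDO_IN, V_LDO_OUT = 5.0, 3.0

eta_ldo = V_LDO_OUT / V_LDO_IN   # an LDO's efficiency is roughly Vout/Vin
eta_total = ETA_BOOST * eta_ldo  # everything above Vout/Vin is burned as heat

print(f"LDO efficiency:     {eta_ldo:.0%}")
print(f"cascade efficiency: {eta_total:.0%}")
```

    Roughly half the battery's energy never reaches the nRF52, which is why two DC-DCs (or a lower rail) are suggested instead.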

    Regarding the extreme noise, I would hazard a guess that there is some daisy-chain interaction between the DC-DC, LDO and nRF52, where the nRF52 demands periodic large current bursts and the DC-DC, when the battery voltage is low towards end-of-life, is unable to deliver them. The alternative is a software/hardware bug on the nRF52 in the % calculation, the voltage measurement or the sleep handling.

    A noisy BLE environment, in our experience, usually drains batteries faster when there are multiple disconnect-connect cycles rather than simple collisions, though of course excessive collisions will also reduce battery life. A better packet algorithm may be the answer if so: fewer packets with less overhead. Lossless 24-bit 4 ms ECG samples, together with temperature, 3-axis orientation and a slew of other data, require fewer than 5 small packets per second with the 2M PHY.
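    A back-of-envelope check of that packet budget (the 244-byte notification payload assumes data length extension with a 247-byte ATT MTU, and the auxiliary-data budget is an assumption):

```python
import math

# 24-bit ECG sample every 4 ms, as stated above.
SAMPLE_PERIOD_S = 0.004
BYTES_PER_SAMPLE = 3       # 24 bits
MAX_PAYLOAD = 244          # assumed notification payload with data length extension

ecg_bytes_per_s = BYTES_PER_SAMPLE / SAMPLE_PERIOD_S   # 750 B/s of ECG data
aux_bytes_per_s = 50       # assumed budget for temperature, orientation, etc.

packets_per_s = math.ceil((ecg_bytes_per_s + aux_bytes_per_s) / MAX_PAYLOAD)
print(packets_per_s)
```

    That lands at 4 packets per second, consistent with the "fewer than 5" figure.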

    Edit: It is worth logging a count of BLE disconnect events; that usually indicates trouble with power even if data throughput is still acceptable. Each disconnect-connect cycle costs battery power.

  • On the noisy BLE environment note, I want to add something. Tai said that RSSI is good. However, RSSI is not an indication of connection quality.

    To quote my colleague:

    RSSI is just the power (loudness) of the signal. It says more about the distance to the source than the quality of the signal (but is also affected by room shape and other causes of reflections). To illustrate: The signal could be loud while there are other loud noises that interfere with the signal, giving you a high "good" RSSI, but a bad connection. You need the SNR (signal to noise ratio) to know the quality of the signal.

    What this means is: even though the RSSI was high, your connection quality might still have been fairly bad, leading to a lot of retransmissions or disconnections and, in turn, increased consumption.

    This last reply in a case about signal noise also explains a bit more:
    RE: SNR / PER in nRF Connect SDK

    As for testing for disconnections, it might be worth noting that the connection can sometimes be just bad enough to lose packets frequently, but not bad enough to cause frequent disconnections. In other words: if you see frequent disconnections, your environment is certainly challenging; but if you don't, that doesn't mean your environment is great. It may merely be acceptable.

    On the firmware side, a potential flaw is the application being kept awake waiting for a transmission to complete instead of being put to sleep. It's worth checking whether this is happening and avoiding it.
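    The difference between the two patterns can be sketched in Python as a stand-in for the firmware (the names and the simulated radio are hypothetical; on the nRF52 the sleeping variant corresponds to waiting for the TX-complete event in a low-power state rather than polling a flag):

```python
import threading
import time

tx_done = threading.Event()

def wait_busy():
    # Anti-pattern: spins at full speed until TX completes. On an MCU this
    # keeps the core out of its low-power sleep state the whole time.
    while not tx_done.is_set():
        pass

def wait_sleeping():
    # Preferred: block until the TX-complete event fires; the nRF52
    # analogue is sleeping until the radio event wakes the core.
    tx_done.wait()

# Simulated radio: signals completion shortly after the "TX" starts.
threading.Thread(target=lambda: (time.sleep(0.05), tx_done.set())).start()
wait_sleeping()
print("tx complete, back to sleep")
```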

  • Thanks for your insights! My initial concern was the abnormal battery discharge curve observed in a noisy environment. However, you pointed out flaws in the power design for this device as well as the risk of increased power consumption in a noisy environment. I really appreciate it!

    One point I'm still unclear on is the significant fluctuation in battery level. While I agree the battery life may be shorter than benchtop tests suggest, the root cause of such fluctuation remains unexplained. Thanks, Tai!

    Update: I reviewed both the data and the video recorded during the period of significant battery level fluctuation. If there had been data loss or retransmission attempts, I would expect to see some lag in the video—but I didn’t observe any.


  • Best if you could share the SAADC and % calculation code; maybe we'll spot something ..
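    In the meantime, here is a minimal sketch of the kind of filtering that suppresses transient sags, assuming a simple linear voltage-to-percent map (the endpoint voltages and window size are illustrative, not from the actual firmware):

```python
from statistics import median

# Assumed discharge endpoints for a 1.5 V cell; a real gauge would use
# the cell's measured discharge curve instead of a linear map.
V_FULL = 1.55    # assumed "100%" voltage
V_EMPTY = 1.05   # assumed "0%" voltage

def voltage_to_percent(volts):
    """Linear voltage-to-percent map, clamped to 0..100."""
    pct = 100.0 * (volts - V_EMPTY) / (V_FULL - V_EMPTY)
    return max(0, min(100, round(pct)))

class BatteryGauge:
    """Median-filter the last few ADC readings so a single sag under
    radio load cannot produce a 65% -> 20% -> 60% style jump."""

    def __init__(self, window=5):
        self.window = window
        self.samples = []

    def update(self, volts):
        self.samples.append(volts)
        del self.samples[:-self.window]          # keep the last `window` samples
        return voltage_to_percent(median(self.samples))
```

    The point is that a brief voltage sag during a TX burst changes one sample, not the reported percentage; if the firmware converts each raw SAADC reading directly to a percentage, that same sag shows up as exactly the kind of jump described above.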
