Battery level reading using ADC pin from nRF chip shows significant fluctuation

Hi everyone,

We're reading the BAT level from an ADC pin; the setup diagram is shown below.

We tested our device at a hospital and observed significant fluctuation (highlighted in red) in its BAT level. For example, it went from 65% to 20%, then jumped back to 60%.

We then took the device and battery back to the office and tried to reproduce the issue on the benchtop. However, we couldn't: the following is what we got, and it worked as expected.

Our hypothesis is that it could be due to an impedance mismatch between the ADC input pin and the output of the voltage divider. However, based on our calculation, the output (Thévenin) impedance of the voltage divider is R13//R15 (50 kΩ), while the input impedance of the ADC pin is more than 1 MΩ, so this should not cause an impedance mismatch.

I would greatly appreciate it if anyone has suggestions on this issue. Thanks.

  • Hi Tai,

    I dare not claim to be an expert, but I have been assigned the case, and will try to help.

    This behavior is indeed abnormal. The SAADC on the nRF52 series doesn't have any limitation that would explain that.

    Could you please elaborate on what you did in the bench test? I notice that the discharge curve in the field test isn't similar to the one in the bench test, assuming the drain is constant in both scenarios.

    Is it possible that the device entered any special mode, or was exposed to abnormal environmental conditions, during the time of the measurement dip?
    For example, extreme EM noise requiring multiple retries would keep the radio on longer, increasing the current draw.

    Is this observed multiple times or just once, and with one particular unit, or multiple units?

    Hieu

  • Hi Hieu. 

    Could you please elaborate on what you did in the bench test? I notice that the discharge curve in the field test isn't similar to the one in the bench test, assuming the drain is constant in both scenarios.

    I used the exact hardware device and battery that were used in the hospital. I basically turned the device on, connected it to our mobile application, then let it run until the battery was fully depleted. I left the device on my desk, which is a normal working environment in the office.

    Side note: the first discharge curve covers only a 1.5-hour measurement, while the second one, conducted on the bench, is 8 hours long.

    Is it possible that the device entered any special mode, or was exposed to abnormal environmental conditions, during the time of the measurement dip?

    It was running in normal mode. As for the environmental conditions at the hospital: there is a lot of medical equipment, some of which emits in the 2.4 GHz band. In other words, it is pretty noisy, and we have observed several BLE disconnects there before because of that.

    However, for this case I did check the RSSI values recorded in our application, and they looked normal.

    Is this observed multiple times or just once, and with one particular unit, or multiple units?

    It has happened to us twice, in different hospitals. In the other case, we put in a brand new battery; the HW device showed low BAT right away, and a BLE disconnect happened after a couple of minutes.

  • "AAA Battery" - one AAA battery, or more than one? If only one, how does the 1.5 V AAA provide the nRF VDD, i.e. could you share the regulator schematic for the VDD supply? Perhaps also share the calculation for battery %, including the number of readings used, or show the battery voltage instead of the battery %.

  • We used one AAA battery (1.5 V) and a buck-boost converter to obtain the 3 V that powers the device. The average lifespan of a battery is about 11.5 hours.

  • There have been many tests (200+) done before without any issue in the BAT discharge curve. This only started happening recently, so I'm very confused.

    Btw, I developed this based on the following way of measuring the BAT level:
