Understanding and Improving State-of-Charge Measurement Variability with the nRF52840 and nPM1300

We are using a Nordic nRF52840 together with an nPM1300 PMIC and rely on Nordic's fuel-gauge algorithm (via `nrf_fuel_gauge_init` and `nrf_fuel_gauge_process`) to calculate the battery's state of charge (SoC). However, we are observing inconsistencies in the SoC measurements:

  1. The reported SoC depends on the system current being drawn at the moment the algorithm is initialized.
  2. When the battery is fast-charged at 0.5C and the system is then restarted and re-measured at low or zero current, the SoC deviates by up to 10%.

My understanding was that Nordic's algorithm compensates for load current and should therefore produce consistent results regardless of the system current at initialization.

What could be the root cause of this discrepancy, and how can we improve or resolve this behavior to achieve more reliable SoC measurements?
