Hi everyone,
We are developing a BLE project on the nRF52832 chip.
We need to measure the battery voltage to estimate the remaining charge.
For this purpose we enabled the ADC (SAADC) to measure the supply voltage (VDD) and configured it with the following parameters (a code sketch follows the list):
ADC configuration:
Resolution = 10bit
Oversample = Disabled
Low Power Mode = Enabled
IRQ priority = 7
Channel configuration:
Resistor P = Disabled, Resistor N = Disabled
Gain = 1/6
Reference voltage = Internal (0.6 V)
Acquisition time = 40 us
Mode = Single ended
Burst = Disabled
Pin P = Input VDD
Pin N = Input Disabled
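For reference, here is a minimal sketch of our init code, assuming the legacy nrf_drv_saadc driver from the nRF5 SDK (function names and error handling are illustrative; the low_power_mode field exists in the newer SDK versions of the driver config):

    #include "nrf_drv_saadc.h"
    #include "app_error.h"

    static void saadc_event_handler(nrf_drv_saadc_evt_t const * p_event)
    {
        // Handle NRF_DRV_SAADC_EVT_DONE / NRF_DRV_SAADC_EVT_CALIBRATEDONE here.
    }

    static void saadc_init(void)
    {
        ret_code_t err_code;

        // Driver configuration: 10-bit resolution, no oversampling,
        // low power mode enabled, IRQ priority 7.
        nrf_drv_saadc_config_t saadc_config =
        {
            .resolution         = NRF_SAADC_RESOLUTION_10BIT,
            .oversample         = NRF_SAADC_OVERSAMPLE_DISABLED,
            .interrupt_priority = 7,
            .low_power_mode     = true
        };

        // Channel 0: single-ended measurement of VDD, gain 1/6,
        // internal 0.6 V reference, 40 us acquisition time.
        // The default SE config already disables resistors, burst and pin N.
        nrf_saadc_channel_config_t channel_config =
            NRF_DRV_SAADC_DEFAULT_CHANNEL_CONFIG_SE(NRF_SAADC_INPUT_VDD);
        channel_config.gain      = NRF_SAADC_GAIN1_6;
        channel_config.reference = NRF_SAADC_REFERENCE_INTERNAL;
        channel_config.acq_time  = NRF_SAADC_ACQTIME_40US;

        err_code = nrf_drv_saadc_init(&saadc_config, saadc_event_handler);
        APP_ERROR_CHECK(err_code);

        err_code = nrf_drv_saadc_channel_init(0, &channel_config);
        APP_ERROR_CHECK(err_code);
    }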
We run an offset calibration every 5 minutes, with the first calibration executed immediately at startup.
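In code this looks roughly like the sketch below, assuming app_timer for the 5-minute interval (SDK >= 13 style APP_TIMER_TICKS) and nrf_drv_saadc_calibrate_offset() for the calibration itself; the timer name is illustrative:

    #include "app_timer.h"
    #include "nrf_drv_saadc.h"
    #include "app_error.h"

    APP_TIMER_DEF(m_cal_timer_id);
    #define CALIBRATION_INTERVAL_MS  (5 * 60 * 1000)

    static void cal_timer_handler(void * p_context)
    {
        // Start an offset calibration; the driver reports completion with
        // NRF_DRV_SAADC_EVT_CALIBRATEDONE in the SAADC event handler.
        if (nrf_drv_saadc_calibrate_offset() == NRF_ERROR_BUSY)
        {
            // A conversion is in progress; retry on the next timer tick.
        }
    }

    static void calibration_timer_start(void)
    {
        ret_code_t err_code;

        err_code = app_timer_create(&m_cal_timer_id,
                                    APP_TIMER_MODE_REPEATED,
                                    cal_timer_handler);
        APP_ERROR_CHECK(err_code);

        // First calibration runs immediately, then every 5 minutes.
        cal_timer_handler(NULL);
        err_code = app_timer_start(m_cal_timer_id,
                                   APP_TIMER_TICKS(CALIBRATION_INTERVAL_MS),
                                   NULL);
        APP_ERROR_CHECK(err_code);
    }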
With the same supply voltage applied, the spread between different chips was 12 mV (2 LSB) with calibration and 21 mV (3 LSB) without calibration.
We tried increasing the resolution from 10 bits to 12 and even 14 bits, but the effective accuracy did not improve; the chip-to-chip difference in mV actually became larger.
According to the formula in this post: devzone.nordicsemi.com/.../how-to-calculate-battery-voltage-into-percentage-for-aa-2-batteries-without-fluctuations
an error of 10 mV gives an error in the reported charge of up to 7% when the charge is between 100% and 42% (3.0 V - 2.9 V). How can we solve the problem of this error, which varies from chip to chip?
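For reference, a minimal sketch of the conversion and of where the ~7% comes from, assuming 10-bit resolution, gain 1/6 and the internal 0.6 V reference (full scale = 3.6 V, so 1 LSB = 3600 mV / 1024 ~ 3.5 mV), and using only the 3.0 V -> 2.9 V segment of the discharge curve from the linked post (function names are illustrative):

    #include <stdint.h>

    // Convert a raw single-ended SAADC sample to millivolts.
    // Assumes: gain = 1/6, reference = internal 0.6 V, resolution = 10 bit,
    // so full scale input = 0.6 V / (1/6) = 3.6 V and 1 LSB = 3600 mV / 1024.
    static uint16_t saadc_raw_to_mv(int16_t raw)
    {
        if (raw < 0) { raw = 0; }   // clamp small negative offsets
        return (uint16_t)(((uint32_t)raw * 3600u) / 1024u);
    }

    // Rough percentage in the 3.0 V - 2.9 V segment of the discharge curve
    // from the linked post (100 % at 3000 mV, 42 % at 2900 mV):
    // slope = (100 - 42) % / 100 mV = 0.58 % per mV,
    // so a 12 mV chip-to-chip spread maps to roughly 12 * 0.58 ~ 7 %.
    static uint8_t battery_percent_3v0_to_2v9(uint16_t mv)
    {
        if (mv >= 3000u) { return 100u; }
        if (mv <= 2900u) { return 42u;  }
        return (uint8_t)(42u + ((mv - 2900u) * 58u) / 100u);
    }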
Best Regards
Boris Fridman