For measuring battery voltage (a lithium cell on VDDH), I am looking into how to get the best precision out of the built-in SAADC of the nRF52840, and how to quantify that precision.
By reading through the documentation and the forums, I have gathered the following:
- Don't bother with the built-in calibration, since it can be weirdly jumpy. Instead, since speed is irrelevant here, measure the ADC's 0 V offset directly with a differential conversion of VDD against itself (see the sketch after this list).
- Put a low-pass RC filter in front of the ADC input to reject noise (sizing thoughts below).
- Oversample heavily. At these sample rates it doesn't hurt, and it should reduce noise.
- Potentially use an external 0.1% resistor divider instead of the built-in VDDHDIV5 path, which is specified as ±1% (cost: ~20 cents).
- The internal 0.6 V voltage reference apparently also has a ±3% error (according to the COMP section of the manual). This could be fixed with an external voltage reference (~70 cents).
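
On the RC filter sizing, my back-of-the-envelope reasoning (happy to be corrected): the cutoff is f_c = 1/(2πRC), so e.g. R = 10 kΩ and C = 1 µF give f_c ≈ 16 Hz, which is plenty for a battery voltage that changes over hours. The series resistance also interacts with the SAADC's acquisition time: as far as I can tell from the acquisition-time table, TACQ = 40 µs is rated for source resistances up to 800 kΩ, so a filter resistor in the tens of kΩ should not distort the reading.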
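
To make the offset and oversampling points concrete, this is roughly what I have in mind: a bare-metal, blocking sketch using the nRF52840 MDK register names (untested; an nrfx_saadc version would look different, and the gain/TACQ choices are just my current assumptions):

```c
#include <stdint.h>
#include "nrf.h"  // nRF52840 MDK register definitions

// One blocking, 256x-oversampled SAADC conversion on channel 0.
// psel_p/psel_n take SAADC_CH_PSELP_PSELP_* / SAADC_CH_PSELN_PSELN_* values,
// mode is SAADC_CH_CONFIG_MODE_SE or SAADC_CH_CONFIG_MODE_Diff.
static int16_t saadc_sample(uint32_t psel_p, uint32_t psel_n, uint32_t mode)
{
    volatile int16_t result;  // EasyDMA writes the conversion result here

    NRF_SAADC->RESOLUTION = SAADC_RESOLUTION_VAL_12bit;
    NRF_SAADC->OVERSAMPLE = SAADC_OVERSAMPLE_OVERSAMPLE_Over256x;

    NRF_SAADC->CH[0].PSELP = psel_p;
    NRF_SAADC->CH[0].PSELN = psel_n;
    NRF_SAADC->CH[0].CONFIG =
        (mode                            << SAADC_CH_CONFIG_MODE_Pos)   |
        (SAADC_CH_CONFIG_BURST_Enabled   << SAADC_CH_CONFIG_BURST_Pos)  |  // one SAMPLE task runs the whole burst
        (SAADC_CH_CONFIG_GAIN_Gain1_6    << SAADC_CH_CONFIG_GAIN_Pos)   |  // 0..3.6 V input range
        (SAADC_CH_CONFIG_REFSEL_Internal << SAADC_CH_CONFIG_REFSEL_Pos) |  // internal 0.6 V reference
        (SAADC_CH_CONFIG_TACQ_40us       << SAADC_CH_CONFIG_TACQ_Pos);     // longest acquisition, tolerates high source impedance

    NRF_SAADC->RESULT.PTR    = (uint32_t)&result;
    NRF_SAADC->RESULT.MAXCNT = 1;

    NRF_SAADC->ENABLE = SAADC_ENABLE_ENABLE_Enabled;
    NRF_SAADC->EVENTS_STARTED = 0;
    NRF_SAADC->TASKS_START = 1;
    while (NRF_SAADC->EVENTS_STARTED == 0) { }
    NRF_SAADC->EVENTS_END = 0;
    NRF_SAADC->TASKS_SAMPLE = 1;
    while (NRF_SAADC->EVENTS_END == 0) { }
    NRF_SAADC->EVENTS_STOPPED = 0;
    NRF_SAADC->TASKS_STOP = 1;
    while (NRF_SAADC->EVENTS_STOPPED == 0) { }
    NRF_SAADC->ENABLE = SAADC_ENABLE_ENABLE_Disabled;

    return result;
}

// Offset in volts: a differential conversion with both inputs on VDD should
// ideally read 0. Differential 12-bit full scale is ±2048 LSB over ±3.6 V at
// gain 1/6, so 1 LSB = 3.6 V / 2048. (Whether a differential offset transfers
// 1:1 to single-ended mode is something I still need to verify.)
static float saadc_offset_volts(void)
{
    int16_t raw = saadc_sample(SAADC_CH_PSELP_PSELP_VDD,
                               SAADC_CH_PSELN_PSELN_VDD,
                               SAADC_CH_CONFIG_MODE_Diff);
    return raw * 3.6f / 2048.0f;
}

// VDDH in volts via the internal 1/5 divider, offset-corrected.
// Single-ended: V_pin = RESULT * REF / (GAIN * 2^12) = RESULT * 3.6 / 4096.
static float vddh_volts(float offset_volts)
{
    int16_t raw = saadc_sample(SAADC_CH_PSELP_PSELP_VDDHDIV5,
                               SAADC_CH_PSELN_PSELN_NC,
                               SAADC_CH_CONFIG_MODE_SE);
    return ((raw * 3.6f / 4096.0f) - offset_volts) * 5.0f;
}
```

With BURST enabled, a single SAMPLE task runs the whole 256x burst and the result register holds the averaged value, so the blocking loop stays simple.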
What is unclear to me:
The datasheet lists "Integral non-linearity, 12-bit resolution" (INL) as 4.7 LSB, but also EG1/4 (and the other EG entries) as ±3%, which is far more than 4.7 LSB. Does that mean the 4.7 LSB only applies at 0? What exactly is the "EG" error?
If it is a gain error, but linear over the input range, do I understand correctly that it can be compensated by measuring a precise external reference against neutral? (I.e., compensating for the internal reference voltage error would at the same time compensate for EG.)
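
If that understanding is right, the compensation I would attempt looks like the following (building on the sketch above; the 3.000 V value and the AIN0 pin are placeholders for whatever reference part and pin I end up using):

```c
// Hypothetical setup: a precise external reference, assumed here to be
// 3.000 V, wired to AIN0 (through the same kind of RC filter).
#define VREF_EXT_VOLTS 3.000f

// One-point gain calibration: measure the known reference and derive a
// multiplicative correction k. This folds the internal 0.6 V reference
// error and any linear gain error (EG) into a single constant.
static float saadc_gain_factor(float offset_volts)
{
    int16_t raw = saadc_sample(SAADC_CH_PSELP_PSELP_AnalogInput0,
                               SAADC_CH_PSELN_PSELN_NC,
                               SAADC_CH_CONFIG_MODE_SE);
    float measured = (raw * 3.6f / 4096.0f) - offset_volts;
    return VREF_EXT_VOLTS / measured;  // >1 if the ADC reads low
}

// Corrected reading: v = (v_raw - offset) * k. Only valid if EG really is
// linear in the input; the ~4.7 LSB INL would remain as a residual error.
static float vddh_volts_calibrated(float offset_volts, float k)
{
    return vddh_volts(offset_volts) * k;
}
```

One caveat I can already see: calibrating on an AIN pin would correct the reference and ADC gain, but not the ±1% tolerance of the internal VDDHDIV5 divider, which sits outside the calibrated path; that alone might justify the external 0.1% divider.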