How to measure Lithium battery voltage with the nRF51 ADC
Is there any way to get these links working again? They don't seem to be functional. Also, I'm assuming that the output of the voltage divider is unaffected by the capacitor.
I have refreshed the links.
Actually, the maximum voltage that is input to the ADC for this setup is approximately:
4.2V * (2.2M/(2.2M+10M)) = 0.76V
and the minimum voltage is approximately
2.7V * (2.2M/(2.2M+10M)) = 0.49V
Attached is a hardware setup for the ADC: a voltage divider with a capacitor on its output. It is meant for measuring a LiIon battery with a voltage range of 2.7-4.2V. For this purpose, we recommend dividing your input voltage with two resistors, R1 = 10Mohm and R2 = 2.2Mohm. Then connect a 22nF capacitor (C2 in the attached schematic) from your ADC analog input pin (AIN) to ground. With this configuration you can sample at up to 20Hz with 8-bit resolution and up to 8Hz with 10-bit resolution without noticeable ADC output error. You should use the internal VBG voltage as reference, which is fixed at 1.2V, and no input prescaling (1/1).
To test this setup, use the rtc0_triggering_ADC_sample_from_GPIO_pin example. It samples from analog input pin 6 and is set up for the hardware configuration mentioned above. The example outputs the sampled ADC result to port 1, i.e. pins 8-15. It is tested with nRF51 SDK 5.2.0. For other nRF51 SDK versions and other ADC examples, check the GitHub examples.
With the setup mentioned above, the voltage divider draws a current of 4.2V/12.2Mohm = 0.35 uA.
The schematic for the voltage divider setup is shown in this blog post.
The voltage divider resistor and capacitor values above have proven to work well for a Lithium-Ion battery setup. However, you can choose other resistor values if desired, e.g. to increase the usable ADC resolution for the battery reading; see the other replies and comments on this thread. Note that changing the resistor values may require a different capacitor size to prevent ADC output error. The following draft document shows how to calculate the capacitor size and maximum sampling frequency for the voltage divider.
Update 22.12.2014 - Capacitor size calculation method updated; the former one was incorrect.
ADC voltage divider - calculating capacitor size v2.pdf
Update 19.12.2014 - Evaluating ADC output
Multimeters and oscilloscopes typically have an input resistance of 1Mohm to 10Mohm, so they will introduce measurement error if you try to measure the voltage on the ADC input directly. I have instead found the voltage on the AIN input by measuring the voltage of the source and the actual resistance of resistors R1 and R2, which typically have a tolerance of 1% or 5%, and then calculating the voltage inside the voltage divider with
V_AIN = V_1 * R2/(R1+R2)
and compare that voltage to the output value of the ADC. The ADC might also have offset and/or gain error, so calibration is recommended to obtain maximum ADC accuracy, as described here.
A customer, Francois, provided an Excel sheet that calculates the capacitor value from the formulas above for arbitrary resistor values. I'll attach his Excel sheet here for convenience.
So with a fully charged lithium-ion battery (4.2V) you would expect to see a voltage of
4.2V * (2.2/10) = 0.924V
Is the functionality of the circuit as follows? The capacitor just stores charge so it can dump it into the internal capacitor used by the ADC. It is essentially acting as a low-impedance voltage source for the ADC sampling circuit?
Your assumption is correct: the capacitor charges up before sampling and holds the voltage steady for long enough during the sampling period. The voltage will of course drop a little, but the capacitor is dimensioned so that the discharge during the sampling period (68 microseconds) is less than what corresponds to 1 bit at 10-bit resolution, so the error is not noticeable.
Would it be possible to adjust the resistor values to bring the 4.2V sample closer to the 1.2V reference, to maximize the range of the ADC? Or is the setup listed above the best possible one for a 4.2V battery?
The thought behind this setup is to keep the voltage on the ADC input pin as close to 0.6V as possible. Then the current flowing in and out of the ADC is minimal, so you can use a small capacitor. If you set the maximum ADC input voltage close to 1.2V, you would need a larger capacitor, perhaps 100nF, which would also take longer to charge up between samples and would limit the maximum sampling frequency a little more. But if you sample the battery at <1Hz, you could set R1 = 6Mohm and C = 100nF, which would lead to:
Maximum voltage on ADC input: 4.2V * 2.2M/(2.2M+6M) = 1.126V
Maximum ADC output value, 10-bit sampling: 1.126/1.2 * 1023 = 961
Minimum voltage on ADC input: 2.7V * 2.2M/(2.2M+6M) = 0.724V
Minimum ADC output value, 10-bit sampling: 0.724/1.2 * 1023 = 617
Usable ADC resolution is: 961 - 617 = 344
To compare, usable ADC resolution for the 10Mohm setup is 230