I am completing a schematic using the nRF51822 part. I will be supplying the part with a 3.3 V supply. I will use one of the ADC inputs to sense battery voltage (10 V max, 4 V min), so a voltage divider will obviously be needed on the AIN pin.
From the datasheet, it looks like VREF_EXT should be 1.2V and no greater than 1.3V. So does this mean if I am using an external voltage reference (my 3.3V supply) that I need a voltage divider on the AREF0 pin to decrease the voltage from 3.3V to 1.2V?
Is there any data on the accuracy of the internal reference voltage over temperature? Are there benefits to using the internal reference vs the external reference other than gaining an additional GPIO?
Would a capacitor in parallel to my AIN pin help with the accuracy of the ADC readings? If so, what would be an appropriate value?
The external reference voltage can be 0.83 V - 1.3 V, as listed in the nRF51822 PS. If you want to measure battery voltage, you need a fixed reference voltage, and the VBG reference, which is fixed at 1.2 V, is convenient for this purpose. You then use the prescalers to fit your input voltage within the 0 V - 1.2 V range of the reference. Please look at this thread to see the saturation points and maximum input values for the different prescale settings.
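The prescaler arithmetic can be sketched on a host machine. The 1.2 V VBG reference, the 1/3 input prescaler and the 10-bit (0-1023) range come from the nRF51822 PS; the helper name and the structure of the sketch are mine:

```c
#include <stdint.h>

/* 10-bit conversion result for the nRF51 ADC with the fixed 1.2 V VBG
 * reference and a selectable input prescaler (1, 2/3 or 1/3).
 * The converter saturates at full scale when the scaled input
 * exceeds the reference voltage. */
static uint32_t adc_code(double v_in, double prescale)
{
    const double v_ref = 1.2;          /* VBG reference voltage */
    double v_scaled = v_in * prescale; /* voltage seen by the converter */
    if (v_scaled > v_ref)
        v_scaled = v_ref;              /* saturation */
    return (uint32_t)(v_scaled / v_ref * 1023.0 + 0.5);
}
```

Note that with the 1/3 prescaler the ADC already saturates at 3.6 V, so a 4-10 V battery still needs an external divider in front of the AIN pin.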
I currently do not have this data available. I will have to respond back to this thread if I get this data, hopefully within 24 hours.
If you want to measure a voltage larger than the maximum 3.6 V, you can connect a high-impedance voltage divider to lower the voltage at the ADC input pin. Other methods are to use a voltage buffer, or a FET in conjunction with a low-resistance voltage divider.
Considerations when lowering the voltage for the ADC input pin:
The impedance of the input voltage source, i.e. the device that generates the voltage to be measured at the ADC input pin, should be low, preferably below 1 kohm. If the source impedance is below 1 kohm, the ADC error specification in the product specification (nRF51822 PS) is valid, and choosing different prescale settings for the ADC input will have close to no effect on accuracy. When a high-impedance voltage source is applied to the ADC input pin, on the other hand, additional gain and offset error is introduced, and it differs between prescalers. Choosing a high-impedance voltage divider is nevertheless desirable, to keep the leakage current through the divider low.
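As a rough sketch of this trade-off, the divider below brings a 10 V battery under the 1.2 V reference range, but its source impedance is far above 1 kohm. The 1 Mohm / 120 kohm values are illustrative numbers of mine, not from the PS:

```c
/* Output voltage of a two-resistor divider as seen from the AIN pin.
 * Component values passed in are illustrative examples. */
static double divider_vout(double v_bat, double r_top, double r_bot)
{
    return v_bat * r_bot / (r_top + r_bot);
}

/* Thevenin (source) impedance the ADC input sees: r_top || r_bot. */
static double divider_z(double r_top, double r_bot)
{
    return (r_top * r_bot) / (r_top + r_bot);
}
```

With r_top = 1 Mohm and r_bot = 120 kohm, 10 V divides down to about 1.07 V and the battery leakage stays below 10 uA, but the roughly 107 kohm source impedance is well outside the 1 kohm specification, so the prescale-dependent gain and offset error applies unless a capacitor is added.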
Please take a look at these three threads to better understand how to dimension your voltage divider, and to see the ADC model, which will help you calculate the size of the capacitor needed to eliminate ADC output error.
You can also determine the capacitor size for your voltage divider by trial and error instead of by calculation.
As a starting point, follow the guidance for measuring lithium battery voltage given in the thread above.
If your voltage range is different from that of a lithium battery, you need to dimension the resistors of your voltage divider so that the output voltage is appropriate.
If your resistor values are smaller than in the lithium battery example, you need a larger capacitor. If the resistor values are larger, you might get away with a smaller capacitor.
If your chosen capacitor gives no ADC error at a low sampling frequency (<1 Hz), the capacitor is large enough.
If you prefer a higher sampling rate, increase the sampling rate until you start to see ADC output error, then keep the sampling rate below that limit.
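The trial-and-error procedure above can be bracketed with two back-of-the-envelope figures. The internal sampling capacitance used here is an assumption for illustration only; see the ADC model thread for the real value:

```c
/* Fractional voltage droop per sample from charge sharing between the
 * external divider capacitor c_ext and the ADC's internal sampling
 * capacitor c_s (the c_s value used below is an assumption). */
static double droop_fraction(double c_ext, double c_s)
{
    return c_s / (c_ext + c_s);
}

/* Minimum time between samples for the divider (source impedance
 * z_source) to recharge c_ext, expressed as n_tau RC time constants. */
static double recharge_time(double z_source, double c_ext, double n_tau)
{
    return n_tau * z_source * c_ext;
}
```

For example, 100 nF against an assumed ~3 pF sampling capacitor gives negligible droop, while five time constants of 107 kohm x 100 nF is about 54 ms, i.e. the sampling rate for such a divider should stay well below roughly 20 Hz.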
To get the most accurate ADC output values at a high sampling frequency, for input voltages above the ADC voltage range, a voltage buffer is needed.
Another possible method is to connect a FET between the power supply and the voltage divider, switching it on to allow current through the divider momentarily before sampling. The voltage divider would then have low resistor values (<1 kohm) and no capacitor connected. It would draw a high current while sampling but no current when not sampling.
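The pay-off of duty-cycling the divider can be estimated as follows. The 1 kohm total resistance, 20 us on-time and 1 s sample period are assumed example numbers, not recommendations from the PS:

```c
/* Current through a low-resistance divider while the FET is on. */
static double divider_current(double v_bat, double r_total)
{
    return v_bat / r_total;
}

/* Average current when the FET is only on for t_on seconds out of
 * every t_period seconds. */
static double average_current(double i_on, double t_on, double t_period)
{
    return i_on * t_on / t_period;
}
```

10 V across a 1 kohm divider draws 10 mA while sampling, but switched on for only 20 us out of every 1 s sample period, the average draw is 0.2 uA.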
The VBG reference voltage variation over temperature on the nRF51 is plus/minus 1%. This is consistent with the 200 ppm/degC figure in table 45 of the nRF51822 PS v2.0. The variation is close to linear, where the lowest VBG voltage is obtained at the lower temperature limit of the nRF51 (-25 deg C) and the highest VBG voltage at the upper temperature limit (75 deg C).
The limits for the internal reference voltage are, however, plus/minus 1.5% according to table 45. The remaining plus/minus 0.5% depends mostly on the supply voltage variation of the nRF51.
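The plus/minus 1% figure follows directly from the 200 ppm/degC number in table 45 applied over the -25 to 75 deg C range:

```c
/* Half the total swing, in percent, of a quantity drifting linearly
 * at ppm_per_c over the temperature range t_min..t_max:
 * 200 ppm/degC * 100 degC = 2% total, i.e. +/-1% about mid-range. */
static double drift_pct(double ppm_per_c, double t_min, double t_max)
{
    return ppm_per_c * (t_max - t_min) / 1.0e6 * 100.0 / 2.0;
}
```

drift_pct(200.0, -25.0, 75.0) reproduces the plus/minus 1% quoted above.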