answered
2013-12-16 12:51:16 +0100
Hi
1. The external reference voltage can be 0.83 V to 1.3 V, as listed in the nRF51822 PS. If you want to measure battery voltage you need a fixed reference voltage. The VBG reference voltage, which is fixed at 1.2 V, is convenient for this purpose. You then use the prescalers to fit your input voltage within the 0 V to 1.2 V range of the reference voltage. Please look at this thread to see the saturation points and maximum input values for the different prescale settings.
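As an illustration, a raw 10-bit sample taken with the internal 1.2 V VBG reference and a 1/3 input prescaler (so full scale corresponds to 3.6 V) can be converted to battery millivolts as below. The function name and the integer-math approach are my own sketch, not from the SDK:

```c
#include <stdint.h>

/* Convert a raw 10-bit nRF51 ADC result to battery voltage in mV,
 * assuming the 1.2 V VBG reference and a 1/3 input prescaler:
 * full scale (1023) then corresponds to 3 * 1.2 V = 3.6 V. */
uint32_t adc_to_battery_mv(uint16_t adc_result)
{
    return ((uint32_t)adc_result * 1200u * 3u) / 1023u;
}
```

The multiplication is done before the division so no precision is lost to intermediate rounding; the worst-case intermediate value (1023 * 3600) still fits comfortably in a uint32_t.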
2. I currently do not have this data available. I will follow up in this thread if I get it, hopefully within 24 hours.

If you want to measure a voltage larger than the maximum 3.6 V, you can connect a high-impedance voltage divider to lower the voltage at the ADC input pin. Other methods are to use a voltage buffer, or a FET in conjunction with a low-resistance voltage divider.
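For example, to bring a supply voltage above 3.6 V down into range, the divider output is simply Vout = Vin * R2 / (R1 + R2). A minimal sketch (the 12 V input and resistor values below are hypothetical, not from the question):

```c
#include <math.h>

/* Output of a two-resistor divider: R1 from the source to the ADC pin,
 * R2 from the ADC pin to ground. */
double divider_vout(double vin, double r1, double r2)
{
    return vin * r2 / (r1 + r2);
}
```

With R1 = 3 Mohm and R2 = 1 Mohm, a 12 V input is divided down to 3 V, safely below the 3.6 V limit.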
Considerations when lowering the voltage for the ADC input pin:
The impedance of the input voltage source, i.e. the device that generates the voltage to be measured by the ADC, must be low, preferably within 1 kohm. If the source impedance is below 1 kohm, the ADC error specification in the product specification (nRF51822 PS) is valid, and choosing different prescale settings for the ADC input will have close to no effect on accuracy. When a high-impedance voltage source is applied to the ADC input pin, on the other hand, additional gain and offset error is introduced, and it differs between prescalers. A high-impedance voltage divider is nevertheless desirable to keep the leakage current through the divider low.
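To see why a high-impedance divider needs a capacitor, note that the source impedance the ADC sees is the parallel combination R1 || R2, which for megaohm-range resistors is far above the ~1 kohm recommendation. A capacitor across the lower resistor supplies the sampling charge, at the cost of a long RC settling time. A sketch with illustrative component values (not from the threads):

```c
#include <math.h>

/* Thevenin source impedance of a two-resistor divider,
 * as seen from the ADC input pin. */
double divider_source_impedance(double r1, double r2)
{
    return (r1 * r2) / (r1 + r2);
}

/* RC time constant of the divider with a capacitor C
 * connected across the lower resistor. */
double divider_time_constant(double r1, double r2, double c)
{
    return divider_source_impedance(r1, r2) * c;
}
```

With R1 = 3 Mohm and R2 = 1 Mohm the source impedance is 750 kohm; adding C = 100 nF gives a time constant of 75 ms, so the divided voltage needs several time constants to settle after a step. This is why a large capacitor works at low sampling rates first.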
Please take a look at these three threads to better understand how to dimension your voltage divider, and to see the ADC model which will help you calculate the size of the capacitor you need to eliminate ADC output error.
https://devzone.nordicsemi.com/index....
https://devzone.nordicsemi.com/index....
https://devzone.nordicsemi.com/index.....
You can also determine a suitable capacitor size for your voltage divider by trial and error instead of by calculation:
[list=1]
As a starting point, follow the guidance for measuring lithium battery voltage given in the thread above.
If your voltage range differs from that of a lithium battery, dimension the resistors of your voltage divider so that the output voltage is within range.
If your resistor values are smaller than in the lithium battery example, you need a larger capacitor. If the resistor values are larger, you might get away with a smaller capacitor.
If your chosen capacitor gives no ADC error at a low sampling frequency (<1 Hz), the capacitor is large enough.
If you prefer a higher sampling rate, increase the sampling rate until you start to see ADC output error, then keep the sampling rate below that limit.
[/list]
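The resistor/capacitor step above amounts to keeping the divider's RC time constant roughly fixed: if you scale the resistors down by some factor, scale the capacitor up by the same factor. A small helper illustrating that rule of thumb (my own formulation, not from the threads):

```c
#include <math.h>

/* Scale a known-good capacitor value when the divider impedance changes,
 * keeping the RC time constant the same: C_new = C_ref * R_ref / R_new. */
double scaled_capacitance(double c_ref, double r_ref, double r_new)
{
    return c_ref * r_ref / r_new;
}
```

For example, if 100 nF worked with a 750 kohm source impedance, dropping the impedance to 75 kohm calls for roughly 1 uF.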
To get ADC output values with the highest accuracy and at a high sampling rate for input voltages above the ADC voltage range, a voltage buffer is needed.
Another possible method is to connect a FET between the power supply and the voltage divider, switching current through the divider on momentarily just before sampling. The voltage divider would then have low resistance (<1 kohm) and no capacitor. It would draw a relatively high current while sampling, but no current when not sampling.
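The attraction of the FET-switched divider is the average current: the low-resistance divider draws a large current, but only during the short sampling window. A sketch with hypothetical numbers (12 V supply, 1 kohm total divider resistance, 100 us on-time, 1 Hz sampling):

```c
#include <math.h>

/* Current through the divider while the FET is conducting. */
double divider_on_current(double vbat, double r_total)
{
    return vbat / r_total;
}

/* Average current for a duty-cycled divider: on-current times
 * on-time per sample times sampling frequency. */
double divider_avg_current(double i_on, double t_on, double f_sample)
{
    return i_on * t_on * f_sample;
}
```

With these numbers the divider draws 12 mA while sampling, but only 1.2 uA on average at 1 Hz.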