I’m searching for some more detailed information on the SAADC input impedance.
Please see attached file for details and questions.
a) R_INPUT models leakage paths, for example the leakage of ESD clamps, so this is a hard question to answer: there will be device-to-device variation and large temperature variation (lower resistance when hot). What we've measured is in excess of 50 MOhm. In general, we are talking about leakage currents in the nA range here.
b) R_INPUT does not change with gain setting. For the other settings, I think it depends on what you mean by "change": during sampling, the network of transistors connected to the pad will be different from the network during idle, so there is a theoretical change in the amount of leakage (pA). R_INPUT is not a physical resistor, but rather a model of all the leakage paths from the pad to VDD and ground.
c) The value for R_ladder is given for a typical process; it could vary by up to ±20 % in a typical silicon process technology.
d) C_sample will depend on process variation (device to device), which could be up to ±20 % in a typical silicon process technology. In addition, C_sample changes with gain setting.
Maybe you could elaborate on your source resistance? It would be helpful to understand your application.
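As a rough sketch of what nA-level leakage means in practice, this computes the offset it would cause across a source resistance; the 50 MOhm is the measured figure mentioned above, and the 400 kOhm source is only an assumed example:

```c
/* Sketch: offset error from DC leakage across a source resistance.
 * 50 MOhm is the measured figure from above; 400 kOhm is an assumed
 * application value, not a datasheet number. */
#include <stdio.h>

int main(void)
{
    const double vdd      = 1.8;    /* supply voltage [V]                   */
    const double r_leak   = 50e6;   /* measured leakage resistance [Ohm]    */
    const double r_source = 400e3;  /* assumed equivalent source res. [Ohm] */

    double i_leak = vdd / r_leak;      /* rough upper bound on pad current */
    double v_err  = i_leak * r_source; /* voltage drop across the source   */

    printf("leakage current: %.0f nA\n", i_leak * 1e9);  /* 36 nA   */
    printf("offset error:    %.1f mV\n", v_err * 1e3);   /* 14.4 mV */
    return 0;
}
```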
Thank you for the input.
I'm using the ADC as a battery monitor with a simple resistor divider, and I want to predict the obtainable precision of the voltage measurement. The nRF52810 runs at a regulated 1.8 V, and during battery monitoring a single-ended ADC input and the internal reference are used. I was looking at this case for inspiration:
https://devzone.nordicsemi.com/b/blog/posts/measuring-lithium-battery-voltage-with-nrf51
and was wondering how MOhm-range resistors in the divider are compatible with the ADC datasheet values of Rinput in the same MOhm range.
For nRF52810, the datasheet says "Rinput >1Mohm (typically)".
If using an equivalent source resistance of e.g. 400 kOhm, it makes a significant difference to the sampled ADC value whether the equivalent Rinput is 1 MOhm or 10 MOhm.
I expect you to have a specification stating a typical and a guaranteed minimum value of Rinput over the temperature range. Otherwise the precision of the ADC measurements cannot be predicted.
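To make the concern concrete, here is a minimal sketch of the divider loading, assuming an 800 kOhm / 800 kOhm divider (a 400 kOhm equivalent source resistance) and treating Rinput as a plain resistor to ground:

```c
/* Sketch: how Rinput loads a battery voltage divider. The divider
 * values are assumptions chosen to give a 400 kOhm equivalent source
 * resistance; the two Rinput candidates are the values in question. */
#include <stdio.h>

int main(void)
{
    const double v_bat = 3.0;    /* battery voltage [V], assumed        */
    const double r_top = 800e3;  /* divider top resistor [Ohm], assumed */
    const double r_bot = 800e3;  /* divider bottom resistor [Ohm]       */
    const double r_input[] = { 1e6, 10e6 };  /* candidate Rinput [Ohm]  */

    double v_ideal = v_bat * r_bot / (r_top + r_bot);  /* unloaded: 1.5 V */

    for (int i = 0; i < 2; i++) {
        /* Rinput appears in parallel with the bottom resistor. */
        double r_eff  = (r_bot * r_input[i]) / (r_bot + r_input[i]);
        double v_meas = v_bat * r_eff / (r_top + r_eff);
        printf("Rinput = %5.0f kOhm: %.3f V instead of %.3f V (%.1f %% low)\n",
               r_input[i] / 1e3, v_meas, v_ideal,
               100.0 * (v_ideal - v_meas) / v_ideal);
    }
    return 0;
}
```

With Rinput = 1 MOhm the reading would be almost 30 % low; with 10 MOhm it would be under 4 % low, which is why a guaranteed minimum matters.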
Hi,
Have you taken a look at this one?
https://devzone.nordicsemi.com/b/blog/posts/measuring-lithium-battery-voltage-with-nrf52
The ADC on the nRF51 is a different architecture, and the concerns from that post do not translate to the nRF52.
On nRF51 there was a DC current flowing into the ADC, while on nRF52 there is an AC current given by the sampling frequency, internal cap, and voltage, which is why we recommend a maximum of 800 kOhm source resistance in the PS for TACQ = 40 us.
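A rough sketch of that charge-based load (the 200 kHz and 2.5 pF below are illustrative example numbers, not values to design to):

```c
/* Sketch: average current drawn by a sampled-capacitor input.
 * Each sample transfers Q = C * V, so the average current is
 * I = f * C * V, and the cap looks like R = 1/(f * C) on average.
 * All values below are illustrative assumptions. */
#include <stdio.h>

int main(void)
{
    const double f_sample = 200e3;    /* sampling frequency [Hz] */
    const double c_sample = 2.5e-12;  /* sampling capacitor [F]  */
    const double v_in     = 1.5;      /* input voltage [V]       */

    double i_avg   = f_sample * c_sample * v_in;  /* 750 nA */
    double r_equiv = 1.0 / (f_sample * c_sample); /* 2 MOhm */

    printf("average input current: %.0f nA\n", i_avg * 1e9);
    printf("equivalent resistance: %.2f MOhm\n", r_equiv / 1e6);
    return 0;
}
```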
Thank you for pointing out the difference. It would have been very valuable to have a short description of this in the datasheet...
So I read it as follows: the nRF52 SAADC input impedance consists of a DC leakage path (nA range) and an AC load introducing an equivalent resistance calculated by the formula:
Rinput = 1 / (fsample * Csample)
Did I understand that correctly?
A few more questions:
1) In your first answer above (point b), the R_INPUT you talk about is the DC leakage (independent of gain) and not the equivalent resistance determined by 1 / (fsample*Csample), because Csample is stated to depend on the gain setting. Right?
2) What is the relation between the stated typical value "Rinput>1Mohm" in the datasheet and the formula "Rinput=1/(fsample*Csample)"?
Using the maximum sampling rate (200 kHz) and the maximum value of Csample (2.5 pF), the calculated Rinput comes out at 2 MOhm.
3) Regarding electrical specifications in the datasheet:
What are the temperature conditions for the typical values stated? 25 degC?
What are the temperature conditions for the max/min values stated? -40 degC to 85 degC?
1) Correct, R_INPUT != 1/(fsample*Csample). R_INPUT is purely a model of the DC leakage; it does not include the effects of the sampling cap.
2) There is no relation between the two; they are independent factors. They do, however, both contribute to the apparent current that goes into the SAADC.
3) This can be seen in the Recommended operating conditions section of the Product Specification for the product that you're using.
Sorry, I thought I got it, but now I feel like I'm back to square one... I don't know what SAADC input impedance spec to use for my design...
Let's try to use some more precise definitions to avoid mixing things up:
I think I understand that the SAADC input impedance (let's call this Z_SAADC) consists of the DC leakage (let's call this R_DC) and the equivalent impedance due to the switched capacitor Csample (let's call this Z_CAP).
a) The specification in the datasheet, "Input resistance - Rinput>1Mohm (typically)": how should this be interpreted, then?
Above, you clearly say that this datasheet term "Rinput" is NOT related at all to Z_CAP.
If the datasheet term "Rinput" is supposed to be interpreted as R_DC, then I have to deal with a worst-case DC input impedance of 1 MOhm, which cannot be the case according to the referenced battery monitor application: https://devzone.nordicsemi.com/b/blog/posts/measuring-lithium-battery-voltage-with-nrf52
I'm confused. Please clarify!
For my design I need to know the worst-case load the SAADC can put on the circuit connected to the SAADC port.
b) Regarding temperature conditions for electrical specifications:
I think you misunderstand me. Let's look at a datasheet example, even though my question relates to ALL the electrical specifications stated in the datasheet. Let's look at the specification of the pull-up resistance: min 11 kOhm, typ 13 kOhm, max 16 kOhm.
Under which temperature conditions (and supply voltage conditions as well) are these three values stated?
Are all three values only valid at the nominal operating temperature (25 degC) and nominal supply voltage (3.0 V)?
(If so, how do I get values for the temperature range, which is very relevant in many places?)
or
Are all three values valid over the recommended operating temperature range (-40 degC to 85 degC) and supply voltage range (1.7 V to 3.6 V)?
or
Is the typical value valid at 25 degC/3.0 V, and the min and max values valid over -40 degC to 85 degC and 1.7 V to 3.6 V?
or
something else?
It would be valuable if this were stated much more clearly in the datasheet. Your competitors often state something like: "Electrical Characteristics over recommended ranges of supply voltage and operating free-air temperature (unless otherwise noted)".
Thank you for your patience :-)
a) Sorry if this is confusing. I agree that Z_SAADC is a good term, because that would then be Z_SAADC = R_DC || 1/(f_sample * C_sample). The datasheet does not say that R_DC = 1 MOhm; it says R_DC > 1 MOhm. To qualify this number we have measured the input resistance, and based on those measurements we know that it is > 50 MOhm at typical conditions.
b) For that specific case (pull-up resistance), the min/max values cover voltage, process, and temperature variations.
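Putting the whole thread together, a minimal sketch of the combined model, Z_SAADC = R_DC || 1/(f_sample * C_sample), using the figures quoted above (50 MOhm measured, 200 kHz, 2.5 pF; these are examples, not guaranteed worst-case numbers):

```c
/* Sketch: combined SAADC input impedance model from this thread,
 * Z_SAADC = R_DC || 1/(f_sample * C_sample). The values are the
 * ones quoted in the discussion, not guaranteed min/max figures. */
#include <stdio.h>

int main(void)
{
    const double r_dc     = 50e6;     /* measured DC leakage res. [Ohm] */
    const double f_sample = 200e3;    /* max sampling rate [Hz]         */
    const double c_sample = 2.5e-12;  /* max sampling capacitance [F]   */

    double z_cap   = 1.0 / (f_sample * c_sample);      /* 2 MOhm   */
    double z_saadc = (r_dc * z_cap) / (r_dc + z_cap);  /* parallel */

    printf("Z_CAP   = %.2f MOhm\n", z_cap / 1e6);    /* 2.00 MOhm */
    printf("Z_SAADC = %.2f MOhm\n", z_saadc / 1e6);  /* 1.92 MOhm */
    return 0;
}
```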