
Questions around SAADC offset calibration

Hi!

I'm using the nrf52 on custom prototype boards and the official development boards. We are using the ADC to read analog values and encountered unexpected behavior related to the saadc offset calibration.

I'm currently using nrf_drv_saadc_calibrate_offset() to perform the offset calibration.
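
Roughly like this (a minimal sketch assuming the nRF5 SDK legacy nrf_drv_saadc driver; buffer handling and error checking from our real code are left out):

    // Minimal sketch, assuming the nRF5 SDK legacy nrf_drv_saadc driver.
    #include "nrf_drv_saadc.h"

    static volatile bool m_calibration_done;

    static void saadc_callback(nrf_drv_saadc_evt_t const * p_event)
    {
        if (p_event->type == NRF_DRV_SAADC_EVT_CALIBRATEDONE)
        {
            m_calibration_done = true;      // offset calibration finished
        }
        // NRF_DRV_SAADC_EVT_DONE (filled sample buffer) is handled elsewhere
    }

    static void calibrate_saadc_offset(void)
    {
        m_calibration_done = false;

        // The driver returns NRF_ERROR_BUSY while a conversion is in
        // progress, so retry until the calibration task actually starts.
        while (nrf_drv_saadc_calibrate_offset() != NRF_SUCCESS)
        {
            // wait and try again
        }

        // Block until the CALIBRATEDONE event arrives from the driver.
        while (!m_calibration_done)
        {
        }
    }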

What I can see, both when measuring single-ended and differentially (we have a 0.5% LDO on one of the AIN ports), is that the offset error typically gets worse after running the offset calibration compared to not running it at all.

Examples reading the 1.8 V 0.5% LDO single-ended, with gain 1/4 and 4x oversampling against the internal 0.6 V reference:

device      no calib                   with calib
            error avg   error stdev    error avg   error stdev
v6c9        -0.002      0.000709        0.037      0.000739
v6c8         0.014      0.000746        0.012      0.000799
v6c3         0.001      0.000688        0.044      0.000770
v6c31        0.002      0.000734        0.028      0.000630
v6c29        0.005      0.000569       -0.009      0.000709
v6c26        0.010      0.000750        0.002      0.000620
avg          0.005      0.000699        0.019      0.000711

These were 6 different prototype boards. The numbers are the error between the measured voltage and the nominal 1.8 V, assuming the LDO output is exactly 1.8 V (which it is not, but it is typically very close at room temperature). All measurements were made at the same temperature (around 22 °C). Each board was measured with around 100 samples, both before and after offset calibration.
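
For completeness, the channel configuration behind these numbers looks roughly like this (sketch using nRF5 SDK driver names; AIN0 is an example pin, not our actual routing):

    // Sketch of the single-ended channel config used for the table above.
    nrf_drv_saadc_config_t saadc_config = NRF_DRV_SAADC_DEFAULT_CONFIG;
    saadc_config.resolution = NRF_SAADC_RESOLUTION_12BIT;
    saadc_config.oversample = NRF_SAADC_OVERSAMPLE_4X;        // 4x oversampling

    nrf_saadc_channel_config_t channel_config =
        NRF_DRV_SAADC_DEFAULT_CHANNEL_CONFIG_SE(NRF_SAADC_INPUT_AIN0);
    channel_config.gain      = NRF_SAADC_GAIN1_4;             // gain 1/4
    channel_config.reference = NRF_SAADC_REFERENCE_INTERNAL;  // internal 0.6 V

    APP_ERROR_CHECK(nrf_drv_saadc_init(&saadc_config, saadc_callback));
    APP_ERROR_CHECK(nrf_drv_saadc_channel_init(0, &channel_config));

With this setup the full-scale input is 0.6 V / (1/4) = 2.4 V, so the 1.8 V LDO sits comfortably inside the input range.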

The bottom row shows the average of the error averages and of the error standard deviations. Within that row:

  • the "no calib" error average shows that the typical offset error without offset calibration is 0.5%
  • the "with calib" error average shows that the typical offset error WITH offset calibration is 1.9%
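
For scale: taking "error" here as (V_measured - 1.8 V) / 1.8 V, the average error of 0.005 corresponds to roughly 9 mV at 1.8 V, and the worst case of 0.044 to roughly 79 mV.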

Some of the extremes show around 4.4% error with offset calibration vs. 0.1% error without it. There are only two cases where the offset calibration improved the value, and in those cases the uncalibrated offset was already pretty good.

Is this expected behavior? I guess this might still be within the ±3% tolerance of the internal reference, but it's surprising that offset calibration makes the result worse in so many cases.

What happens when the temperature is far lower or far higher for the devices whose offset is already ~4% off after calibration? Will those devices still stay within the ±3% spec?

We did a lot more tests with lab power supplies and more devices, and also ran the same tests on 4 different Nordic dev boards; everything behaves very similarly to what is presented above.

One other thing I was wondering: for differential measurements, what is the typical impact of offset calibration? Shouldn't its effect be almost unnoticeable, because only non-linearity errors should be present in a differential setup?
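
For reference, the differential channel we have in mind looks roughly like this (sketch, nRF5 SDK names; AIN0/AIN1 are example pins):

    // Sketch of the differential channel config (example pins).
    nrf_saadc_channel_config_t diff_config = {
        .resistor_p = NRF_SAADC_RESISTOR_DISABLED,
        .resistor_n = NRF_SAADC_RESISTOR_DISABLED,
        .gain       = NRF_SAADC_GAIN1_4,
        .reference  = NRF_SAADC_REFERENCE_INTERNAL,   // 0.6 V
        .acq_time   = NRF_SAADC_ACQTIME_10US,
        .mode       = NRF_SAADC_MODE_DIFFERENTIAL,
        .pin_p      = NRF_SAADC_INPUT_AIN0,
        .pin_n      = NRF_SAADC_INPUT_AIN1
    };
    APP_ERROR_CHECK(nrf_drv_saadc_channel_init(1, &diff_config));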

Thanks, Peter
