
Questions around SAADC offset calibration

Hi!

I'm using the nRF52 on custom prototype boards and on the official development boards. We use the ADC to read analog values and have encountered unexpected behavior related to the SAADC offset calibration.

I'm currently using nrf_drv_saadc_calibrate_offset() to perform the offset calibration.
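
For context, this is roughly how the calibration is triggered (a sketch based on the nRF5 SDK's legacy nrf_drv_saadc driver, to be run after the driver is initialized; the blocking wait and the handler name are my own, and error handling is trimmed):

    #include "nrf_drv_saadc.h"

    static volatile bool m_calibrated;

    /* Driver event handler: calibration completion is reported here. */
    static void saadc_event_handler(nrf_drv_saadc_evt_t const * p_event)
    {
        if (p_event->type == NRF_DRV_SAADC_EVT_CALIBRATEDONE)
        {
            m_calibrated = true;
        }
        /* NRF_DRV_SAADC_EVT_DONE for filled sample buffers is handled elsewhere. */
    }

    static void calibrate_offset_blocking(void)
    {
        m_calibrated = false;
        /* The call returns NRF_ERROR_BUSY while a conversion is in
           progress, so retry until the driver accepts it. */
        while (nrf_drv_saadc_calibrate_offset() != NRF_SUCCESS)
        {
        }
        while (!m_calibrated)
        {
            __WFE();  /* sleep until the CALIBRATEDONE event fires */
        }
    }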

What I can see, both when measuring single-ended and differential (we have a 0.5% LDO on one of the AIN ports), is that the offset error typically gets worse after running the offset calibration than without it.

Example readings of the 1.8 V 0.5% LDO, measured single-ended with gain 1/4 and 4× oversampling against the internal 0.6 V reference:

               no calib                   with calib
device     error avg   error stdev    error avg   error stdev
v6c9          -0.002      0.000709        0.037      0.000739
v6c8           0.014      0.000746        0.012      0.000799
v6c3           0.001      0.000688        0.044      0.000770
v6c31          0.002      0.000734        0.028      0.000630
v6c29          0.005      0.000569       -0.009      0.000709
v6c26          0.010      0.000750        0.002      0.000620
avg            0.005      0.000699        0.019      0.000711

These were 6 different prototype boards. The table shows the error of the measured value relative to the reference 1.8 V, assuming the LDO is 100% accurate (it is not, but it is typically very close at room temperature). All measurements were made at the same temperature (around 22 °C), and each board was measured with around 100 samples both before and after offset calibration.
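
For reference, the channel setup behind these numbers corresponds roughly to the sketch below (nRF5 SDK legacy nrf_drv_saadc API, reusing the saadc_event_handler from the calibration sketch above; the AIN pin, acquisition time and burst flag are illustrative, not necessarily our exact values):

    /* 12-bit, single-ended, gain 1/4 against the internal 0.6 V reference,
       4x oversampling. Full scale is 0.6 V / (1/4) = 2.4 V, so an ideal
       1.8 V input should read 1.8 / 2.4 * 4096 = 3072 counts. */
    static void saadc_setup(void)
    {
        nrf_drv_saadc_config_t adc_cfg = NRF_DRV_SAADC_DEFAULT_CONFIG;
        adc_cfg.resolution = NRF_SAADC_RESOLUTION_12BIT;
        adc_cfg.oversample = NRF_SAADC_OVERSAMPLE_4X;
        APP_ERROR_CHECK(nrf_drv_saadc_init(&adc_cfg, saadc_event_handler));

        nrf_saadc_channel_config_t ch_cfg =
            NRF_DRV_SAADC_DEFAULT_CHANNEL_CONFIG_SE(NRF_SAADC_INPUT_AIN0);
        ch_cfg.gain      = NRF_SAADC_GAIN1_4;
        ch_cfg.reference = NRF_SAADC_REFERENCE_INTERNAL;  /* 0.6 V */
        ch_cfg.acq_time  = NRF_SAADC_ACQTIME_10US;
        ch_cfg.burst     = NRF_SAADC_BURST_ENABLED;  /* for oversampling a single channel */
        APP_ERROR_CHECK(nrf_drv_saadc_channel_init(0, &ch_cfg));
    }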

The bottom row shows the average of the error averages and of the error standard deviations. Within the bottom row:

  • the leftmost number shows that the typical offset error with no offset calibration is 0.5%
  • the third column shows that the typical offset error WITH offset calibration is 1.9%

You can also see that some extremes show around 4.4% error with offset calibration vs. 0.1% error without it. There are only two cases where offset calibration improved the value, and in those cases the uncalibrated offset was already quite good.

Is this expected behavior? I guess this might still be within the ±3% tolerance of the internal reference, but it's surprising that offset calibration makes the error worse in so many cases.

What happens when the temperature is far lower or far higher for the devices whose offset is already ~4% wrong? Will these devices still stay within the ±3% spec?

We did a lot more tests with lab power supplies and more devices, and also ran the same tests on 4 different Nordic dev boards; everything behaves very similarly to the results presented above.

One other thing I was wondering: when doing differential measurements, what is the typical impact of offset calibration on those? Shouldn't the offset calibration be almost unnoticeable, because only non-linearity errors are present in a differential setup?

Thanks, Peter

  • This could be caused by noise. Can you try calibration with OVERSAMPLE = 5 (32x)?

  • Hi! Thanks for the quick reply. Oversampling does not seem to help. I ran a test series on some of the devices with 4×, 32×, 128× and 256× oversampling. The difference from 4× to 128× was at most around 2 points on the ADC (4096 scale), whereas the offset calibration itself still moved the result around 50 points further away from the uncalibrated average, to a total of around 90 points off the actual value (the worst-case accuracy of the input voltage is 15 points on the ADC scale).

    Actual values: the 1.8 V reference input should read 3072; the uncalibrated value is e.g. 3030, the calibrated value 2980.

    I also tried calibrating only once and never again, and also after every sample (with 3 seconds between samples); both lead to the same results.

  • In this test you are dominated by gain error, not offset, and the gain error will remain after offset calibration. If you want to check how the offset changes with offset calibration, you need to apply a 0 V input. Do you get more stable values as you increase the oversampling?

    It's interesting if the gain error changes with offset calibration, because it should not. To separate gain and offset error you need two points: 0 V and 1.8 V input. A sketch of that two-point calculation follows below.
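
    A minimal sketch of that two-point separation (the code at 0 V is an assumed example; the code at 1.8 V is the value quoted earlier in this thread):

        #include <stdio.h>

        int main(void)
        {
            /* Mean ADC codes measured at the two input points. */
            const double code_0V   = 12.0;    /* assumed example at 0 V input */
            const double code_1V8  = 3030.0;  /* quoted above for 1.8 V input */
            const double ideal_1V8 = 3072.0;  /* 1.8 V * (1/4) / 0.6 V * 4096 */

            /* The 0 V point gives the offset directly; the slope between
               the two points, relative to the ideal slope, gives the gain. */
            double offset_lsb = code_0V;
            double gain_err   = (code_1V8 - code_0V) / ideal_1V8 - 1.0;

            printf("offset: %.1f LSB, gain error: %.2f %%\n",
                   offset_lsb, gain_err * 100.0);
            return 0;
        }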
