nRF5340 SAADC TASKS_CALIBRATEOFFSET calibration time missing in the product specification

Hello, the nRF5340's SAADC has a temperature-dependent offset according to the product specification. The offset can be measured (and, I suppose, automatically compensated within the SAADC) via the TASKS_CALIBRATEOFFSET task; we trigger it roughly as in the sketch below.
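
For reference, a minimal blocking sketch of how we trigger it, assuming the MDK register names (NRF_SAADC) and that the SAADC is already enabled; error handling and interrupt-driven operation are omitted:

    NRF_SAADC->EVENTS_CALIBRATEDONE = 0;     // clear any stale event
    NRF_SAADC->TASKS_CALIBRATEOFFSET = 1;    // start offset calibration
    while (NRF_SAADC->EVENTS_CALIBRATEDONE == 0) {
        // The duration of this wait is exactly the calibration time
        // that I cannot find in the product specification.
    }
    NRF_SAADC->EVENTS_CALIBRATEDONE = 0;     // acknowledge the event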

I could not find any information about the calibration time in the product specification (v1.5). Did I overlook it, or search in the wrong place? There is information about acquisition time and conversion time, but not about offset calibration time.

Is this information available anywhere? If it is already in the product specification, please point me to it. Otherwise, please share it here and, if possible, add it to the product specification (also for the nRF54 series). This information is important for modelling the delays in real-time systems during a re-calibration cycle.

Is the offset calibration time static, or is it dynamic with a worst-case upper limit?

One more short question regarding the input voltage: is it safe to apply voltages that exceed the full-scale reference voltage to an analog input pin? For example, the internal reference voltage and gain are set for a 2.4 V full scale, and 3.0 V is applied at the pin. My understanding from the Absolute Maximum Ratings section is that this scenario is physically safe for the SoC (AIN pins have the same limits as other pins).

Best regards,
Michael

  • Hi Michael,

    I haven't heard from the team yet, but out of curiosity, what is your concern with the time here?

    Is it sufficient for you to wait for the calibration-complete event, or do you need to plan around strict timing?

    Best regards,
    Hieu

  • Hi Hieu,

    We are using the ADC for a regulation loop that may run for a long time without ever being idle or going through a power cycle. If we enable re-calibration during normal operation, even if it happens very rarely, we have to account for the delay as a worst-case scenario in our real-time system.

    Let's say: if the self-calibration time (including acquisition time, if there is any for self-calibration) is below ~20 µs, it would be perfect. If it is below 500-800 µs, it would be acceptable. If it is a couple of ms, we might have to take extra measures to let the customer decide between re-calibration and low-latency operation.

    So the best case would be if the calibration time fit into our real-time time slices. But it would also be possible to wait for the completion event (skipping measurements over multiple time slices). What we need is a worst-case value that we can use in our calculations and algorithms; for now we bound it empirically, as in the sketch below.
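
    Until the worst case is documented, we time the calibration ourselves, roughly as follows (a sketch, assuming a free TIMER0 instance clocked from the 16 MHz base and the MDK register names; a measured value is of course no substitute for a specified maximum):

        NRF_TIMER0->MODE = TIMER_MODE_MODE_Timer;
        NRF_TIMER0->BITMODE = TIMER_BITMODE_BITMODE_32Bit;
        NRF_TIMER0->PRESCALER = 0;               // 16 MHz -> 62.5 ns per tick
        NRF_TIMER0->TASKS_CLEAR = 1;
        NRF_TIMER0->TASKS_START = 1;

        NRF_SAADC->EVENTS_CALIBRATEDONE = 0;
        NRF_SAADC->TASKS_CALIBRATEOFFSET = 1;    // start offset calibration
        while (NRF_SAADC->EVENTS_CALIBRATEDONE == 0) { }

        NRF_TIMER0->TASKS_CAPTURE[0] = 1;        // snapshot elapsed ticks
        uint32_t calib_us = NRF_TIMER0->CC[0] / 16;  // ticks at 16 MHz -> µs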

    By the way, does the ADC's offset also drift over time, or only over temperature? The PS says it changes over temperature. Wouldn't it then be possible to measure temperature-specific offsets beforehand (during SoC production) and program compensation parameters into the SoC?

    Best regards,
    Michael
