
SAADC scan + burst?

On SDK 16.0.0 + Mesh SDK 4.1.0

I know that oversampling does not work when using the SAADC in scan mode on multiple channels, because the samples end up out of order in the buffer.

But I believe burst mode can be used in scan mode: it samples each channel multiple times as fast as possible, averages the readings, and then puts the result in the buffer in the correct order. Burst mode uses the oversample setting to determine how many readings to average. Is this all correct?

Does this work in SDK 16.0.0 by simply setting the nrf_drv_saadc_config_t .oversample value and the nrf_saadc_channel_config_t .burst value? I did this and everything seems to be working, but I don't know whether burst averaging is actually happening. I initially tried to use nrf_saadc_burst_set(channel, NRF_SAADC_BURST_ENABLED) to enable burst for each channel, but that did not work and the readings were all wrong.
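
For concreteness, here is a minimal sketch of the configuration I mean (the pin assignments, channel numbers, oversample factor, and the use of APP_ERROR_CHECK are just illustrative, not my exact project code):

    // Sketch: SAADC scan mode on two channels with burst + oversample (nRF5 SDK 16.0.0).
    // The oversample setting controls how many conversions each BURST averages per channel.
    #include "nrf_drv_saadc.h"
    #include "app_error.h"

    #define SAADC_SAMPLES_IN_BUFFER 2   // one result slot per enabled channel

    static nrf_saadc_value_t m_buffer_pool[2][SAADC_SAMPLES_IN_BUFFER];

    static void saadc_callback(nrf_drv_saadc_evt_t const * p_event)
    {
        // Handle NRF_DRV_SAADC_EVT_DONE / NRF_DRV_SAADC_EVT_CALIBRATEDONE here.
    }

    static void saadc_init(void)
    {
        nrf_drv_saadc_config_t saadc_config = NRF_DRV_SAADC_DEFAULT_CONFIG;
        saadc_config.oversample = NRF_SAADC_OVERSAMPLE_4X;   // 4 conversions averaged per burst

        APP_ERROR_CHECK(nrf_drv_saadc_init(&saadc_config, saadc_callback));

        // Channel 0 on AIN0 and channel 1 on AIN1, both with burst enabled so that the
        // averaged results land in the buffer in scan order.
        nrf_saadc_channel_config_t ch0 =
            NRF_DRV_SAADC_DEFAULT_CHANNEL_CONFIG_SE(NRF_SAADC_INPUT_AIN0);
        ch0.burst = NRF_SAADC_BURST_ENABLED;
        APP_ERROR_CHECK(nrf_drv_saadc_channel_init(0, &ch0));

        nrf_saadc_channel_config_t ch1 =
            NRF_DRV_SAADC_DEFAULT_CHANNEL_CONFIG_SE(NRF_SAADC_INPUT_AIN1);
        ch1.burst = NRF_SAADC_BURST_ENABLED;
        APP_ERROR_CHECK(nrf_drv_saadc_channel_init(1, &ch1));

        // Double-buffered sampling.
        APP_ERROR_CHECK(nrf_drv_saadc_buffer_convert(m_buffer_pool[0], SAADC_SAMPLES_IN_BUFFER));
        APP_ERROR_CHECK(nrf_drv_saadc_buffer_convert(m_buffer_pool[1], SAADC_SAMPLES_IN_BUFFER));
    }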

Or do some modifications need to be made like in this thread? https://devzone.nordicsemi.com/f/nordic-q-a/26659/saacd-scan-oversample/

Thanks.

  • Adding DEBUG to the preprocessor defines didn't change anything, no assert.

    I set the PPI to trigger once every second (roughly how the trigger is wired is sketched at the end of this reply). You make it sound like sampling should take longer when low_power_mode is set to 1, but when I set it to 1 it actually samples much faster, around 10 times per second.

    Thanks.
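
    For reference, this is roughly how my sampling trigger is wired (the timer instance, the period handling, and the use of APP_ERROR_CHECK are illustrative, not my exact project code):

        // Sketch: TIMER compare event -> PPI -> SAADC SAMPLE task, firing once per second.
        #include "nrf_drv_timer.h"
        #include "nrf_drv_ppi.h"
        #include "nrf_drv_saadc.h"
        #include "app_error.h"

        static const nrf_drv_timer_t m_saadc_timer = NRF_DRV_TIMER_INSTANCE(2);
        static nrf_ppi_channel_t     m_ppi_channel;

        static void timer_handler(nrf_timer_event_t event_type, void * p_context)
        {
            // Not used: sampling is triggered entirely through PPI.
        }

        static void saadc_sampling_trigger_init(void)
        {
            nrf_drv_timer_config_t timer_cfg = NRF_DRV_TIMER_DEFAULT_CONFIG;
            timer_cfg.frequency = NRF_TIMER_FREQ_31250Hz;   // slow base clock so a 1 s period fits
            timer_cfg.bit_width = NRF_TIMER_BIT_WIDTH_32;
            APP_ERROR_CHECK(nrf_drv_timer_init(&m_saadc_timer, &timer_cfg, timer_handler));

            // COMPARE0 fires every 1000 ms and the shortcut clears the timer again.
            uint32_t ticks = nrf_drv_timer_ms_to_ticks(&m_saadc_timer, 1000);
            nrf_drv_timer_extended_compare(&m_saadc_timer, NRF_TIMER_CC_CHANNEL0, ticks,
                                           NRF_TIMER_SHORT_COMPARE0_CLEAR_MASK, false);

            // Route the COMPARE0 event to the SAADC SAMPLE task through PPI.
            APP_ERROR_CHECK(nrf_drv_ppi_init());
            APP_ERROR_CHECK(nrf_drv_ppi_channel_alloc(&m_ppi_channel));
            APP_ERROR_CHECK(nrf_drv_ppi_channel_assign(
                m_ppi_channel,
                nrf_drv_timer_compare_event_address_get(&m_saadc_timer, NRF_TIMER_CC_CHANNEL0),
                nrf_drv_saadc_sample_task_get()));
            APP_ERROR_CHECK(nrf_drv_ppi_channel_enable(m_ppi_channel));

            nrf_drv_timer_enable(&m_saadc_timer);
        }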

  • Hello,

    Thank you for your patience.
    I just ran the project you provided. My only modifications were to some of the preprocessor include directories and to set the DEBUG flag.
    I am able to reproduce the issue, and I clearly see the behavior you described: the buffer is shifted by one following the first calibration and afterwards stays consistent.

    ftjandra said:
    I set the PPI to trigger every second. You make it sound like when low_power_mode is set to 1, it should take longer to sample, but when I set it to 1, it is sampling much faster, around 10 times per second.

    I tested with low-power mode as well, with the same result.
    I was not able to reproduce the seemingly more frequent sampling you describe in low-power mode, but on my end the buffer shift happens in low-power mode too.

    I will now delve deeper into this issue and see if I can figure out its root cause and how to work around it. I will update you as soon as I have something.

    Best regards,
    Karl

  • Hello again,

    I have done some more testing, and I might have identified the source of the error.
    The START task seems to be triggered too soon after the CALIBRATEDONE event.
    I observe that adding a delay of >= 3 ms in the CALIBRATEDONE event handler, before the buffer convert calls, removes the buffer shift altogether.
    I have run this multiple times now, for extended periods, and have not seen any buffer shifts. Please try this on your end as well, and let me know what you observe.
    This is the addition which you might make to your project:

        else if (p_event->type == NRF_DRV_SAADC_EVT_CALIBRATEDONE) {
            __LOG(LOG_SRC_APP, LOG_LEVEL_INFO, "SAADC calibration complete\n");
            saadc_calibrate_stage = 0; // Reset the calibration state flag

            // INSERTED DELAY
            // A 2 ms delay still yields a buffer shift by 1 position following the 2nd calibration.
            // A 3 ms delay yields a constant buffer with no shift.
            nrf_delay_ms(3);

            // Both buffers need to be set up again, as they were removed by the call
            // to nrf_drv_saadc_abort before calibration.
            ERROR_CHECK(nrf_drv_saadc_buffer_convert(saadc_buffer_pool[0], SAADC_SAMPLES_IN_BUFFER));
            ERROR_CHECK(nrf_drv_saadc_buffer_convert(saadc_buffer_pool[1], SAADC_SAMPLES_IN_BUFFER));

            nrf_drv_timer_enable(&m_saadc_timer);
        }

    You will also have to include nrf_delay.h in your project.

    This issue is especially strange since the way you have implemented the CALIBRATEDONE event handler is the normal way to set up the buffers following a calibration. This leads me to believe it might be an artifact of the scan + oversampling + burst configuration. That is just my current suspicion, which I will continue working to verify. I have also opened an internal request to have this reviewed by the module's engineers.
    Thank you for bringing this up; this is absolutely an interesting find!

    Best regards,
    Karl

  • I can confirm that adding the 3ms delay stops the buffer shift. But isn't it odd that before adding the delay it only shifted once after the first calibration and never again? I would expect it to shift every time.

    I will wait for your follow up before I put this into production firmware.

    Thanks.

  • Hi,

    ftjandra said:
    I can confirm that adding the 3ms delay stops the buffer shift. But isn't it odd that before adding the delay it only shifted once after the first calibration and never again? I would expect it to shift every time.

    Yes, I find this odd too, as I would expect it to happen at other times as well. All my tests so far indicate that it does not, however, so it is absolutely interesting.
    I am in the process of creating a bare-metal demonstration of this behavior, to start debugging at the hardware level and rule the driver out of the problem (a register-level sketch of the sequence I have in mind is included at the end of this reply).
    I hope to know more about this soon.

    ftjandra said:
    I will wait for your follow up before I put this into production firmware.

    Great, I will get back to you on this as soon as I have something to share.
    I have asked for a meeting with the HW and SAADC driver engineers as soon as possible to discuss this.

    Best regards,
    Karl
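
    For reference, the bare-metal sequence I plan to test is along these lines (a register-level sketch only, not the finished demonstration; the buffer size and the stop/start handling are placeholders):

        // Sketch: SAADC offset calibration followed by a restart of scan conversions,
        // with the same >= 3 ms wait between CALIBRATEDONE and the START task.
        #include "nrf.h"
        #include "nrf_delay.h"

        #define SAADC_SAMPLES_IN_BUFFER 2
        static volatile int16_t m_result_buffer[SAADC_SAMPLES_IN_BUFFER];

        static void saadc_calibrate_and_restart(void)
        {
            // Stop any ongoing conversion sequence before calibrating.
            NRF_SAADC->EVENTS_STOPPED = 0;
            NRF_SAADC->TASKS_STOP = 1;
            while (NRF_SAADC->EVENTS_STOPPED == 0) {}

            // Run offset calibration and wait for it to finish.
            NRF_SAADC->EVENTS_CALIBRATEDONE = 0;
            NRF_SAADC->TASKS_CALIBRATEOFFSET = 1;
            while (NRF_SAADC->EVENTS_CALIBRATEDONE == 0) {}

            // The workaround under investigation: give the peripheral >= 3 ms before
            // START, otherwise the first scan after calibration appears shifted by
            // one position in the result buffer.
            nrf_delay_ms(3);

            // Re-arm the EasyDMA result buffer and restart conversions.
            NRF_SAADC->RESULT.PTR     = (uint32_t)m_result_buffer;
            NRF_SAADC->RESULT.MAXCNT  = SAADC_SAMPLES_IN_BUFFER;
            NRF_SAADC->EVENTS_STARTED = 0;
            NRF_SAADC->TASKS_START = 1;
            while (NRF_SAADC->EVENTS_STARTED == 0) {}
        }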
