
Best way to sample ADC over 2 channels with 2 different sampling rates while saving power?

Hi DevZone,

I am working on the nRF52840 with the latest SDK.

I have one differential pair and one single-ended pin I want to sample: the single-ended pin every 40 ms, and the differential pair every 5 seconds.

You may have already guessed that the 40 ms rate is for battery current and the 5 second rate is for battery voltage. Yes, we are trying to measure battery life.

This is easily achievable by sampling both channels every 40 ms and discarding the voltage samples, but I am afraid that discarding samples is a waste of resources.

Searching through the forum, I found the following potential solutions:

1. Init one channel, sample every 40 ms, get the samples, uninit; then init the other channel, sample every 5 seconds, uninit.

2. Use low power mode + burst mode.

Please recommend the most efficient solution to this problem, with details. I noticed there are many posts on this topic; the problem is that I am not sure they are up to date.

Please include details such as:

How do I enable burst mode? In sdk_config.h I only saw an option for enabling low power mode.

How do I init and uninit? Example code validated against the latest SDK would be much appreciated. (Below is an example, but it is 4 years old and I don't know if it works with the latest SDK.)

https://github.com/NordicPlayground/nRF52-ADC-examples/blob/master/saadc_low_power/main.c

Thanks!

  • Hi,

    This is easily achievable by sampling both channels every 40 ms and discarding the voltage samples, but I am afraid that discarding samples is a waste of resources.

    If you want to avoid reconfiguration, sampling both channels every 40 ms is the only option, as scan mode always samples all channels in sequence.

    Init one channel, sample every 40 ms, get the samples, uninit; then init the other channel, sample every 5 seconds, uninit.

    This seems like the most natural option to me in this case with long sampling intervals. You could use an app timer to handle regular reconfiguration and sampling. This way, the average sampling interval will be as accurate as the LFCLK. There will be jitter, but that should not matter in such a use case.
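    A minimal sketch of this approach, assuming the legacy nrf_drv_saadc driver and the app_timer library from the nRF5 SDK; the handler name, channel index, and analog inputs are placeholders and the channel configuration values must be adapted to your hardware:

```c
#include "app_timer.h"
#include "nrf_drv_saadc.h"

APP_TIMER_DEF(m_voltage_timer);              // fires every 5 seconds

static nrf_saadc_value_t m_voltage_sample;

// Called every 5 s in the app_timer context: configure the differential
// channel, take one sample, and release the channel again.
static void voltage_timer_handler(void * p_context)
{
    nrf_saadc_channel_config_t cfg = {
        .resistor_p = NRF_SAADC_RESISTOR_DISABLED,
        .resistor_n = NRF_SAADC_RESISTOR_DISABLED,
        .gain       = NRF_SAADC_GAIN1_6,
        .reference  = NRF_SAADC_REFERENCE_INTERNAL,
        .acq_time   = NRF_SAADC_ACQTIME_10US,
        .mode       = NRF_SAADC_MODE_DIFFERENTIAL,
        .burst      = NRF_SAADC_BURST_DISABLED,
        .pin_p      = NRF_SAADC_INPUT_AIN0,   // placeholder inputs
        .pin_n      = NRF_SAADC_INPUT_AIN1,
    };
    APP_ERROR_CHECK(nrf_drv_saadc_channel_init(1, &cfg));

    // Blocking one-shot conversion; use nrf_drv_saadc_buffer_convert() +
    // nrf_drv_saadc_sample() instead if you need a non-blocking variant.
    APP_ERROR_CHECK(nrf_drv_saadc_sample_convert(1, &m_voltage_sample));

    // Uninit the channel so only the 40 ms current channel stays configured.
    APP_ERROR_CHECK(nrf_drv_saadc_channel_uninit(1));
}

static void voltage_timer_start(void)
{
    APP_ERROR_CHECK(app_timer_create(&m_voltage_timer,
                                     APP_TIMER_MODE_REPEATED,
                                     voltage_timer_handler));
    APP_ERROR_CHECK(app_timer_start(m_voltage_timer,
                                    APP_TIMER_TICKS(5000), NULL));
}
```

    The blocking call is acceptable here because a single conversion only takes on the order of the acquisition time, and it keeps the 5 second path completely independent of the 40 ms sampling chain.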

    2. Use low power mode + burst mode.

    Please recommend the most efficient solution to this problem, with details. I noticed there are many posts on this topic; the problem is that I am not sure they are up to date.

    Please include details such as:

    How do I enable burst mode? In sdk_config.h I only saw an option for enabling low power mode.

    To enable burst mode, set the burst field in the channel configuration struct (nrf_saadc_channel_config_t) to NRF_SAADC_BURST_ENABLED. Note that you typically do not want to combine scan and burst, as described here.
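    For illustration, a minimal sketch with the legacy nrf_drv_saadc driver (the channel number and input pin are placeholders):

```c
// Enable burst on a single-ended channel: with oversampling configured,
// the SAADC then takes all the oversample conversions back-to-back on a
// single SAMPLE trigger instead of needing one trigger per conversion.
nrf_saadc_channel_config_t cfg =
    NRF_DRV_SAADC_DEFAULT_CHANNEL_CONFIG_SE(NRF_SAADC_INPUT_AIN2);
cfg.burst = NRF_SAADC_BURST_ENABLED;
APP_ERROR_CHECK(nrf_drv_saadc_channel_init(0, &cfg));
```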

  • Hi Einar, thanks for reaching out to help.

    " You could use an app timer to handle regular reconfiguration and sampling. "

    1. Could you explain how the app timer is used in an SAADC project?

    2. Can you provide code showing when and how to uninit a channel? I don't see a "saadc_uninit" function provided with the example project.

    Also, I need help changing the sampling rate.

    In function "saadc_sampling_event_init"

    we have 

    uint32_t ticks = nrf_drv_timer_ms_to_ticks(&m_timer, 400);

    1. Does this mean saadc_callback will be fired every 400 ms?

    2. If we need to sample faster than 1 ms, say every 40 µs, how do we write the function? It seems 1 ms is the limit.

    Thanks.

  • Hi,

    cpeng said:
    1. Could you explain how the app timer is used in an SAADC project?

    The app_timer is not used in the SAADC driver or example project in the SDK. However, it is just a simple library that uses an RTC instance to get callbacks at some time in the future. For (re)configuring and sampling the SAADC regularly at a low frequency, you could set up a repeated timer and do the reconfiguration and sampling in its handler.
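    A minimal sketch of such a repeated timer, assuming the nRF5 SDK app_timer library (the handler and function names are placeholders); note that app_timer is driven by an RTC and therefore needs the LFCLK running:

```c
#include "app_timer.h"
#include "nrf_drv_clock.h"

APP_TIMER_DEF(m_saadc_timer);

// Runs every 5 seconds; do the SAADC (re)configuration and sampling here.
static void saadc_timer_handler(void * p_context)
{
    // ... init channel, trigger a sample, uninit the channel again ...
}

static void saadc_timer_init(void)
{
    // app_timer runs off an RTC, which is clocked by the LFCLK.
    APP_ERROR_CHECK(nrf_drv_clock_init());
    nrf_drv_clock_lfclk_request(NULL);

    APP_ERROR_CHECK(app_timer_init());
    APP_ERROR_CHECK(app_timer_create(&m_saadc_timer,
                                     APP_TIMER_MODE_REPEATED,
                                     saadc_timer_handler));
    APP_ERROR_CHECK(app_timer_start(m_saadc_timer,
                                    APP_TIMER_TICKS(5000), NULL));
}
```

    In a SoftDevice application the LFCLK is already started by the SoftDevice, so the clock driver calls may be unnecessary there.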

    cpeng said:
    2. Can you provide code showing when and how to uninit a channel? I don't see a "saadc_uninit" function provided with the example project.

    There is no example of uninitializing, but essentially you just call the uninit function(s) as needed after sampling is done. You have nrf_drv_saadc_channel_uninit() for uninitializing individual channels, and nrf_drv_saadc_uninit() if you want to uninit the driver itself.
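    In sketch form (the channel index is a placeholder):

```c
// After a conversion has completed (e.g. in the NRF_DRV_SAADC_EVT_DONE
// event handler), free just the one channel...
APP_ERROR_CHECK(nrf_drv_saadc_channel_uninit(1));

// ...or tear down the whole driver to power the peripheral down between
// sampling rounds (note: nrf_drv_saadc_uninit() returns void).
nrf_drv_saadc_uninit();
```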

    cpeng said:

    In function "saadc_sampling_event_init"

    we have 

    uint32_t ticks = nrf_drv_timer_ms_to_ticks(&m_timer, 400);

    1. Does this mean saadc_callback will be fired every 400 ms?

    Yes.

    cpeng said:
    2. If we need to sample faster than 1 ms, say every 40 µs, how do we write the function? It seems 1 ms is the limit.

    The tick rate of the timer is much higher, so you just need to use another conversion function. nrf_drv_timer_us_to_ticks() is also available and should do the trick in this case.

  • Thanks!

    One area I am still unclear on:

    What is the relationship between low power mode and oversampling/burst?

    If I want to oversample multiple channels and have their samples averaged, what should I do?

    It seems oversampling + scan will mix up samples from different channels and corrupt the values.

    What role does burst mode play?
