How the timer tick, ADC acquisition time, and sampling rate work

Hello, DevZone!

Recently, I've read a lot of questions and replies related to the nRF5's SAADC and BLE.

Even though I found some pages discussing continuous sampling with a timer, the concept still confuses me.

First of all, I want to write code that continuously reads voltage from a single channel and sends it via Bluetooth.

To do so, my colleague combined two examples: ble_peripheral\ble_app_uart and peripheral\saadc from nRF5 SDK v16.0.0.

In the early stage of development, we found that the code worked as we expected: the SAADC sampled data, and it was transmitted to the Android device via Bluetooth.

Then, to make use of Bluetooth 5's high throughput, I modified the code to sample data at a 20 kHz sampling rate.

According to the throughput demo, I can get 1365 kbps with BLE 5 high-speed mode. That means 20 kHz * 16 bits (actually 12 bits, but packets are built in byte units) = 320 kbps should be easy to achieve.

The problem is that my code is not that fast. I'm using a 7.5 ms connection interval and the GAP event length is set to 6. Data length extension is already enabled, and each packet carries 240 data bytes (I don't know why, but the MTU=247 setting keeps crashing, so I lowered it a little).

Currently, NRF_ERROR_RESOURCES keeps appearing from ble_nus_data_send(), and I think it's because I send data too frequently.

So maybe I can slow down the ADC and find out what the current maximum throughput is.

Phew, here comes my main question:

How can I set the sampling rate to a specific value?

If I lower the transfer rate, the sampling rate should also be lowered to guarantee low delay.

So I wrote the code like this:

#define SAMPLES_IN_BUFFER               10
#define SAMPLES_TO_SEND                 120

static nrf_saadc_value_t                adc_buf[2][SAMPLES_IN_BUFFER];
static uint8_t                          to_snd_buf[SAMPLES_TO_SEND * 2];
static uint16_t                         idx_snd_buf = 0;

(...)

void saadc_sampling_event_init(void)
{
    (...)
    
    uint32_t ticks = nrf_drv_timer_us_to_ticks(&TIMER_ADC, 100);
    nrf_drv_timer_extended_compare(&TIMER_ADC,
                                   NRF_TIMER_CC_CHANNEL0,
                                   ticks,
                                   NRF_TIMER_SHORT_COMPARE0_CLEAR_MASK,
                                   false);
    nrf_drv_timer_enable(&TIMER_ADC);
    
    (...)
}

void saadc_callback(nrf_drv_saadc_evt_t const * p_event)                            // Original version
{
    if (p_event->type == NRF_DRV_SAADC_EVT_DONE)
    {
        ret_code_t err_code;

        err_code = nrf_drv_saadc_buffer_convert(p_event->data.done.p_buffer, SAMPLES_IN_BUFFER);
        adc_buf_state ^= 0x01;
        APP_ERROR_CHECK(err_code);

        for (int i = 0; i < SAMPLES_IN_BUFFER; i++)
        {
            to_snd_buf[idx_snd_buf*2] = adc_buf[adc_buf_state][i] & 0xff;
            to_snd_buf[idx_snd_buf*2 + 1] = (adc_buf[adc_buf_state][i] >> 8) & 0xff;
            idx_snd_buf++;
        }

      
        if (idx_snd_buf >= SAMPLES_TO_SEND)
        {
            uint16_t length = (uint16_t)(SAMPLES_TO_SEND * 2);
            notification_err_code = ble_nus_data_send(&m_nus, to_snd_buf, &length, m_conn_handle);   
            sprintf(error_string, "Error number: %#x\n", notification_err_code);
            SEGGER_RTT_WriteString(0, error_string);
            idx_snd_buf = 0;
        }
    }
}

(...)

The main point of the code is that I get and store 10 ADC samples per NRF_DRV_SAADC_EVT_DONE, and ble_nus_data_send() is called only once the stored data reaches 120 samples.

By doing this, I think both the sampling rate and the transfer rate decrease: previously I got 120 samples in one scoop and sent them immediately, but now I need to wait for 12 timer-driven callbacks before sending data.

In other words, I thought the timer requests the ADC to fill the buffer, so the sampling rate is controlled by the timer. In my case, since I set the timer tick to 100 us, the sampling rate should be 10 kHz.

But when I test with a function generator, the sampling rate seems to differ from what I expect.

Also, there are comments saying that changing the acquisition time will affect the ADC's sampling rate: "use a 3 us acquisition time to make use of the 200 kHz sampling rate."

Can anyone explain to me how the timer tick, ADC acquisition time, and sampling rate work together?

How can I get a 10 kHz or 20 kHz sampling rate?

And also, why does NRF_ERROR_RESOURCES appear and keep me from reaching maximum throughput?

  • In principle this should work, yes. The only problem I see is that the SoftDevice will be interrupting the application, and this may cause application interrupts to be blocked or skipped for a period of time. To speed things up, it may help to look at the GPIOTE example in the nRF5 SDK. It shows how to connect a PPI channel from a timer event to a GPIOTE task, which lets the timer trigger the GPIOTE directly without CPU intervention. In your case you would connect the PPI channel from the timer to the ADC sampling task, so you only need to handle the ADC callback when the data is available. If you set the ADC interrupt priority to 2 and write a very short interrupt routine (for instance, only write the ADC value to a global buffer and increment a counter), then you can let the actual processing of the ADC value and the transmission via ble_nus_data_send() be handled in main().
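    The short-ISR-plus-main-loop split suggested here might look roughly like this (a self-contained sketch with all SDK calls omitted; `saadc_isr_body` and `ring_pop` are hypothetical names, and a real version must also guard against overrun when main() falls behind):

    ```c
    #include <assert.h>
    #include <stdbool.h>
    #include <stdint.h>

    #define RING_SIZE 1024u  /* power of two, hypothetical size */

    static volatile int16_t  ring[RING_SIZE];
    static volatile uint32_t wr_idx;   /* written only by the ADC ISR */
    static uint32_t          rd_idx;   /* used only by main()         */

    /* Keep the high-priority ISR minimal: store the sample, bump the index. */
    void saadc_isr_body(int16_t sample)
    {
        ring[wr_idx % RING_SIZE] = sample;
        wr_idx++;
    }

    /* In main(): drain the ring and hand data to the (slow) BLE path. */
    bool ring_pop(int16_t *out)
    {
        if (rd_idx == wr_idx)
            return false;              /* empty */
        *out = ring[rd_idx % RING_SIZE];
        rd_idx++;
        return true;
    }

    int main(void)
    {
        int16_t v;
        saadc_isr_body(123);           /* pretend two ISR invocations */
        saadc_isr_body(-45);
        assert(ring_pop(&v) && v == 123);
        assert(ring_pop(&v) && v == -45);
        assert(!ring_pop(&v));         /* empty again */
        return 0;
    }
    ```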

  • Thanks for the reply, but it still confuses me...

    In your case you would need to connect the ppi channel from the timer to the adc sampling task

    As I used nrf_drv_ppi_channel_alloc(), nrf_drv_ppi_channel_assign(), and so on in the code I uploaded, I think I have already connected the PPI channel to the ADC sampling task. Haven't I?

    If you set the interrupt priority of adc to interrupt priority 2

    I believe this means APP_IRQ_PRIORITY_MID, which is higher than the APP_IRQ_PRIORITY_LOW I formerly used.

    then you can let the actual processing of the adc value and transmission using ble_nus_data_send() be handled in main()

    Currently, I call ble_nus_data_send() in saadc_callback() function which is assigned to the ADC via nrf_drv_saadc_init(&saadc_config, saadc_callback).

    If I move the code to main(), will the NRF_ERROR_RESOURCES errors decrease? And is there any example I can use to do that?

  • DL_November said:
    As I used nrf_drv_ppi_channel_alloc(), nrf_drv_ppi_channel_assign(), and so on as I uploaded as code, I think I already connected the ppi channel to the adc sampling task. Isn't it?

    I can't find any code in this case that shows this?

    DL_November said:
    Currently, I call ble_nus_data_send() in saadc_callback() function which is assigned to the ADC via nrf_drv_saadc_init(&saadc_config, saadc_callback).

    If you move the ADC interrupt to priority 2, then SVC calls to the SoftDevice are not possible:
    https://infocenter.nordicsemi.com/topic/sds_s140/SDS/s1xx/processor_avail_interrupt_latency/exception_mgmt_sd.html

    Using ADC interrupt priority 2 will allow the ADC interrupt to run more frequently.

    DL_November said:
    If I move the codes to the main, will be the NRF_ERROR_RESOURCES decrease? And is there any example I can use to do that?

    NRF_ERROR_RESOURCES means that the buffers in the SoftDevice are full; the application may need to wait before filling the buffers again. There may also be re-transmissions occurring that decrease the throughput for short periods, so you may need to handle this in the application by buffering the data and trying again after, for instance, 7.5 ms.
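    The buffer-and-retry handling described here could be sketched as follows (mock code: `mock_send()` stands in for ble_nus_data_send(), and the error value is only a stand-in for the SDK's NRF_ERROR_RESOURCES definition; a real application would retry from the NUS TX-ready event or on the next connection interval rather than in a tight loop):

    ```c
    #include <assert.h>
    #include <stdbool.h>
    #include <stdint.h>

    #define NRF_SUCCESS          0u
    #define NRF_ERROR_RESOURCES 19u   /* stand-in for the SDK definition */

    /* Mock send that is "busy" twice before accepting data, to exercise
     * the retry path. In the real application this is ble_nus_data_send(). */
    static unsigned busy_calls = 2;
    static uint32_t mock_send(void)
    {
        if (busy_calls > 0) { busy_calls--; return NRF_ERROR_RESOURCES; }
        return NRF_SUCCESS;
    }

    /* Pattern: treat NRF_ERROR_RESOURCES as "buffers full, try again
     * later", not as a fatal error. */
    static bool send_with_retry(unsigned max_tries)
    {
        for (unsigned i = 0; i < max_tries; i++)
            if (mock_send() == NRF_SUCCESS)
                return true;
        return false;
    }

    int main(void)
    {
        assert(send_with_retry(5));   /* succeeds on the third attempt */
        return 0;
    }
    ```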

  • I can't find any code in this case that shows this?

    Sorry for my mistake. It was in the part I skipped.

    After the nrf_drv_timer_enable() call in saadc_sampling_event_init(), I assigned the timer to the SAADC with the following code.

    uint32_t timer_compare_event_addr = nrf_drv_timer_compare_event_address_get(&TIMER_ADC,
                                                                                NRF_TIMER_CC_CHANNEL0);
    uint32_t saadc_sample_task_addr   = nrf_drv_saadc_sample_task_get();

    // Set up the PPI channel so that the timer compare event triggers the sample task in the SAADC
    err_code = nrf_drv_ppi_channel_alloc(&m_ppi_channel);
    APP_ERROR_CHECK(err_code);

    err_code = nrf_drv_ppi_channel_assign(m_ppi_channel,
                                          timer_compare_event_addr,
                                          saadc_sample_task_addr);

    If you move the adc interrupt to priority 2 then svc calls to the softdevice is not possible:

    Is it because ble_nus_data_send() is a kind of SVC call, which has priority 4?

  • And I think maybe my question was not clear, so I'll ask again.

    Currently, my setting for SAADC is

    - Acquisition time = 3us

    - Burst mode disabled

    - Timer is assigned with 100 us tick

    - Convert buffer size is 10 samples per callback

    In this case, the SAADC begins to collect data when the timer fires.

    Hence the sampling rate should be something like 100 kHz, shouldn't it?

    But when I input a 1 Hz signal to the ADC, the data in the Bluetooth packets looks like a 10 kHz sampling rate, with a period of 10,000 samples.

    Why is the sampling rate neither 200 kHz nor 100 kHz?

  • A timeout of 100 us will sample at 1/100 us = 10 kHz, but you could check this yourself by adding a GPIO toggle in the SAADC callback to see how frequently it executes. It may be a better idea to set the acquisition time to 10 us and sample 9 times for each callback, since that would spread the sampling more evenly over the 100 us.

    Best regards,
    Kenneth
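    Working backwards from the observation above (a trivial self-contained check): if a known 1 Hz test signal shows N samples per period, the effective sampling rate is N * 1 Hz, which here points at the 100 us timer period, not at the 3 us acquisition time:

    ```c
    #include <assert.h>

    /* Effective sampling rate recovered from a known test signal:
     * samples counted per signal period, times the signal frequency. */
    static unsigned effective_rate_hz(unsigned samples_per_period,
                                      unsigned signal_hz)
    {
        return samples_per_period * signal_hz;
    }

    int main(void)
    {
        /* 10,000 samples per period of a 1 Hz input -> 10 kHz, matching
         * 1 / (100 us timer period), not the 3 us acquisition time. */
        assert(effective_rate_hz(10000, 1) == 10000);
        return 0;
    }
    ```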
