Hello, I was wondering why I am not getting the theoretical sampling rate that I configured with a timer.
I am using PPI to trigger SAADC sampling from a timer compare event set to 8 ms, so I expect a sampling rate of 125 Hz.
However, the rate I actually get is not 125 Hz. My connection intervals for BLE are Min = 20 and Max = 30.
Also, I am using 2 buffers (double buffering) so that sampling continues even if events with higher priority than saadc_callback delay the buffer processing.
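For reference, my double-buffered SAADC setup looks roughly like this (SAMPLES_IN_BUFFER and the buffer names are placeholders from my project, not the exact code):

#define SAMPLES_IN_BUFFER 1
static nrf_saadc_value_t m_buffer_pool[2][SAMPLES_IN_BUFFER];

// Queue both buffers up front so the SAADC always has a spare buffer to fill
err_code = nrf_drv_saadc_buffer_convert(m_buffer_pool[0], SAMPLES_IN_BUFFER);
APP_ERROR_CHECK(err_code);
err_code = nrf_drv_saadc_buffer_convert(m_buffer_pool[1], SAMPLES_IN_BUFFER);
APP_ERROR_CHECK(err_code);

static void saadc_callback(nrf_drv_saadc_evt_t const * p_event)
{
    if (p_event->type == NRF_DRV_SAADC_EVT_DONE)
    {
        // Hand the finished buffer back to the driver so conversions keep running
        APP_ERROR_CHECK(nrf_drv_saadc_buffer_convert(p_event->data.done.p_buffer, SAMPLES_IN_BUFFER));
        // ... process / send p_event->data.done.p_buffer here ...
    }
}

The timer + PPI setup looks like this: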
// Set the timer to generate a compare event every 8 ms; the COMPARE0 short clears the counter
uint32_t ticks = nrf_drv_timer_ms_to_ticks(&m_timer, 8);
nrf_drv_timer_extended_compare(&m_timer, NRF_TIMER_CC_CHANNEL0, ticks, NRF_TIMER_SHORT_COMPARE0_CLEAR_MASK, false);
nrf_drv_timer_enable(&m_timer);

uint32_t timer_compare_event_addr = nrf_drv_timer_compare_event_address_get(&m_timer, NRF_TIMER_CC_CHANNEL0);
uint32_t saadc_sample_task_addr   = nrf_drv_saadc_sample_task_get();

/* Set up a PPI channel so that the timer compare event triggers the SAADC SAMPLE task */
err_code = nrf_drv_ppi_channel_alloc(&m_ppi_channel);
APP_ERROR_CHECK(err_code);
err_code = nrf_drv_ppi_channel_assign(m_ppi_channel, timer_compare_event_addr, saadc_sample_task_addr);
APP_ERROR_CHECK(err_code);
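I also enable the channel right after this (omitted above for brevity), since nothing would be sampled otherwise:

err_code = nrf_drv_ppi_channel_enable(m_ppi_channel);
APP_ERROR_CHECK(err_code);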
I am wondering if there are any solutions for the inconsistency of the sampling rate, or a way I can measure the inconsistency and optimize for it. My project depends heavily on a constant sampling rate with no blind spots in the sampling.
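In case it helps, one thing I was considering for measuring the actual rate is counting completed samples over a one-second app_timer and logging the count. This is only a rough sketch; it assumes SDK 14 or newer where APP_TIMER_TICKS takes just the interval in ms, and that NRF_LOG is enabled:

APP_TIMER_DEF(m_rate_timer);
static volatile uint32_t m_samples_this_second = 0;

// Called once per second: log how many samples actually arrived
static void rate_timer_handler(void * p_context)
{
    NRF_LOG_INFO("Samples in last second: %u (expected 125)", m_samples_this_second);
    m_samples_this_second = 0;
}

// In saadc_callback, after re-queuing the buffer:
//     m_samples_this_second += SAMPLES_IN_BUFFER;

// During init:
//     APP_ERROR_CHECK(app_timer_create(&m_rate_timer, APP_TIMER_MODE_REPEATED, rate_timer_handler));
//     APP_ERROR_CHECK(app_timer_start(m_rate_timer, APP_TIMER_TICKS(1000), NULL));

Would something like this be a reasonable way to quantify the drift, or is there a better approach?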
Thank you, John