Hello,
While working with a bare-metal test application (no SoftDevice or OS) to prove some code functionality, I ran into a weird behaviour that I cannot explain.
I'm using a HW timer (NRFX_TIMER) to trigger some routines periodically.
Since I wanted to verify the periodicity of the operation, I reused an RTC, which I initialize and start at boot, to count the tick difference over each period.
We all know the RTC resolution is only about 30.5 us, so I expected some error, but the difference I actually see is around 3 ms every 100 ms, which should be impossible with a working configuration.
I'm sure this is a problem with how I'm configuring things, so I reproduced it on a dev kit (PCA10040).
As a base I took the Timer peripheral example (https://infocenter.nordicsemi.com/topic/sdk_nrf5_v16.0.0/nrf_dev_timer_example.html), which uses a TIMER to blink LEDs.
I modified it in the following places:
Function timer_led_event_handler:
/**
 * @brief Handler for timer events.
 */
void timer_led_event_handler(nrf_timer_event_t event_type, void* p_context)
{
    static uint32_t prev = 0;
    uint32_t now = 0;
    uint32_t diff = 0;
    static uint32_t i;
    uint32_t led_to_invert = ((i++) % LEDS_NUMBER);

    switch (event_type)
    {
        case NRF_TIMER_EVENT_COMPARE0:
            now = app_timer_cnt_get();
            diff = app_timer_cnt_diff_compute(now, prev);
            NRF_LOG_INFO("Ticks Elapsed %d", diff);
            NRF_LOG_FLUSH();
            prev = now;
            bsp_board_led_invert(led_to_invert);
            break;

        default:
            //Do nothing.
            break;
    }
}
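To interpret the "Ticks Elapsed" number I simply convert app_timer ticks to time by hand. A minimal sketch of that conversion, assuming APP_TIMER_CONFIG_RTC_FREQUENCY is 0 (i.e. the app_timer RTC ticks at 32768 Hz); ticks_to_us is just an illustrative helper, not something from my project:

#include <stdint.h>

/* Illustrative helper only: convert app_timer ticks to microseconds,
 * assuming the RTC runs at 32768 Hz (1 tick = 1/32768 s, about 30.5 us). */
static uint32_t ticks_to_us(uint32_t ticks)
{
    return (uint32_t)(((uint64_t)ticks * 1000000ULL) / 32768UL);
}

/* e.g. ticks_to_us(3230)  -> ~98571  us (about  98.6 ms)
 *      ticks_to_us(16153) -> ~492950 us (about 493.0 ms) */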
Then in the main:
/**
 * @brief Function for main application entry.
 */
int main(void)
{
    uint32_t time_ms = 1000; //Time (in milliseconds) between consecutive compare events.
    uint32_t time_ticks;
    uint32_t err_code = NRF_SUCCESS;

    //Configure all LEDs on board.
    bsp_board_init(BSP_INIT_LEDS);

    err_code = NRF_LOG_INIT(NULL);
    /* Create logging task if no errors */
    if (NRF_SUCCESS == err_code)
    {
        NRF_LOG_DEFAULT_BACKENDS_INIT();
    }

    err_code = nrf_drv_clock_init();
    nrf_drv_clock_lfclk_request(NULL);
    //while(false == nrfx_clock_lfclk_is_running()){};

    err_code = app_timer_init();
    err_code = app_timer_create(&executionTimer, APP_TIMER_MODE_SINGLE_SHOT, AppTimerTimeout_Cback);
    if (NRF_SUCCESS == err_code)
    {
        err_code = app_timer_start(executionTimer, TEST_TIMEOUT, NULL);
    }

    NRF_LOG_INFO("Test started");
    NRF_LOG_FLUSH();

    ...
}
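For completeness, the code I left out above (the "...") should be unchanged from the linked SDK 16.0.0 example, so roughly this (TIMER_LED is the example's nrf_drv_timer instance; quoted from the example as a sketch, not verbatim from my project):

nrf_drv_timer_config_t timer_cfg = NRF_DRV_TIMER_DEFAULT_CONFIG;
err_code = nrf_drv_timer_init(&TIMER_LED, &timer_cfg, timer_led_event_handler);
APP_ERROR_CHECK(err_code);

time_ticks = nrf_drv_timer_ms_to_ticks(&TIMER_LED, time_ms);

nrf_drv_timer_extended_compare(&TIMER_LED, NRF_TIMER_CC_CHANNEL0, time_ticks,
                               NRF_TIMER_SHORT_COMPARE0_CLEAR_MASK, true);

nrf_drv_timer_enable(&TIMER_LED);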
Of course I also included in the project all the dependencies needed to compile, and modified sdk_config accordingly.
If I set time_ms to 100 (i.e. 100 ms), I see this output:
00> <info> app: Ticks Elapsed 3230
00> <info> app: Ticks Elapsed 3230
00> <info> app: Ticks Elapsed 3230
00> <info> app: Ticks Elapsed 3229
00> <info> app: Ticks Elapsed 3230
00> <info> app: Ticks Elapsed 3229
00> <info> app: Ticks Elapsed 3229
00> <info> app: Ticks Elapsed 3230
So the elapsed time is around 98.5 ms.
With 500ms:
00> <info> app: Ticks Elapsed 16151
00> <info> app: Ticks Elapsed 16148
00> <info> app: Ticks Elapsed 16154
00> <info> app: Ticks Elapsed 16156
00> <info> app: Ticks Elapsed 16155
00> <info> app: Ticks Elapsed 16153
00> <info> app: Ticks Elapsed 16152
00> <info> app: Ticks Elapsed 16159
00> <info> app: Ticks Elapsed 16160
So the elapsed time is around 493 ms.
This scales linearly: when time_ms is 1000, the elapsed time measured with the RTC is around 986 ms.
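To put numbers on this (simple arithmetic with the 32768 Hz RTC tick):

    100 ms  should be 0.100 * 32768 = 3276.8 ticks; measured ~3230 ticks  -> ~98.6 ms
    500 ms  should be 0.500 * 32768 = 16384 ticks;  measured ~16153 ticks -> ~493 ms
    1000 ms expected;                               measured              -> ~986 ms

In all three cases measured/expected is about 0.986, i.e. a constant ~1.4% relative error rather than a fixed offset.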
It may be something very silly on my side, but can anyone see where the problem might be?
Information about the system:
- running on nRF52 Dev Kit (PCA10040)
- RTC frequency is 32768 Hz
- CLOCK_CONFIG_LF_SRC/NRFX_CLOCK_CONFIG_LF_SRC is set to XTAL
- NRFX_TIMER_DEFAULT_CONFIG_FREQUENCY is 16 MHz, with NRFX_TIMER_DEFAULT_CONFIG_BIT_WIDTH of 16 bit (relevant sdk_config entries are sketched below the list)
- All interrupts use priority 6 (the default); since this is the only thing happening in the system, I doubt it affects anything
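For reference, the relevant sdk_config.h entries should look like this (I'm quoting the standard numeric encodings from memory, so treat the values as an assumption rather than a copy-paste from my config):

#define NRFX_TIMER_DEFAULT_CONFIG_FREQUENCY 0    // 0 = 16 MHz
#define NRFX_TIMER_DEFAULT_CONFIG_BIT_WIDTH 0    // 0 = 16 bit
#define CLOCK_CONFIG_LF_SRC 1                    // 1 = XTAL
#define NRFX_CLOCK_CONFIG_LF_SRC 1               // 1 = XTAL
#define APP_TIMER_CONFIG_RTC_FREQUENCY 0         // prescaler 0 -> 32768 Hz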
Please let me know if you need any more information to investigate this fully.
Thanks!
D