Hello, I am developing with a custom nRF52810 board, nRF5 SDK 17, and the S112 SoftDevice.
I'm using APP_TIMER for my main tick. Here is the code I use to initialize and start the timer:
```c
APP_TIMER_DEF(main_tick_id);

uint32_t err_code = app_timer_init();
APP_ERROR_CHECK(err_code);

err_code = app_timer_create(&main_tick_id, APP_TIMER_MODE_REPEATED, timer_handler);
APP_ERROR_CHECK(err_code);

err_code = app_timer_start(main_tick_id, APP_TIMER_TICKS(1), NULL);
APP_ERROR_CHECK(err_code);
```
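The handler itself only sets a flag; my main loop waits for that flag and toggles a GPIO so I can watch the tick rate on a scope. A simplified sketch of what I'm doing (the pin number and flag name are just placeholders):

```c
#include "nrf_gpio.h"

#define TICK_MEASURE_PIN 11  // placeholder: whichever pin the scope probe is on

static volatile bool m_tick = false;

static void timer_handler(void * p_context)
{
    m_tick = true;  // runs in the app_timer (RTC) interrupt context
}

// In main(), after the init/start code above:
nrf_gpio_cfg_output(TICK_MEASURE_PIN);
for (;;)
{
    if (m_tick)
    {
        m_tick = false;
        nrf_gpio_pin_toggle(TICK_MEASURE_PIN);  // one toggle per timer tick
    }
}
```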
From what I can tell, the "1" in APP_TIMER_TICKS(1) is in milliseconds. However, when I toggle a pin in my main loop on each tick and observe it on an oscilloscope, the timer appears to be running at 1024 Hz, not 1 kHz.
- Is my implementation incorrect for generating a 1 ms tick?
- If the implementation is correct, is there a way to make the app timer run at exactly 1 kHz instead of 1024 Hz?
I'm also curious why the parameter description says milliseconds when the resulting tick is not exactly 1 ms.
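For reference, this is how APP_TIMER_TICKS appears to be defined in app_timer.h in SDK 17 (I have APP_TIMER_CONFIG_RTC_FREQUENCY at its default of 0 in sdk_config.h):

```c
// From app_timer.h (SDK 17); APP_TIMER_CLOCK_FREQ is 32768.
#define APP_TIMER_TICKS(MS)                                \
            ((uint32_t)ROUNDED_DIV(                        \
            (MS) * (uint64_t)APP_TIMER_CLOCK_FREQ,         \
            1000 * (APP_TIMER_CONFIG_RTC_FREQUENCY + 1)))

// With MS = 1 and the default prescaler setting of 0:
//   ROUNDED_DIV(1 * 32768, 1000) = 33 RTC ticks ~= 1.007 ms
// so the period is always a whole number of 1/32768 s RTC ticks.
```

If I read that correctly, I would actually have expected roughly 993 Hz (33 ticks per period) rather than the 1024 Hz I measured, so I may be misreading the macro. Either way, it looks like the period is quantized to the 32.768 kHz RTC and an exact 1 ms tick is not possible.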
Thank you in advance.