Hi All,
Wondering if anyone else has experienced this. I'm currently developing a program based on the ble_hids example in SDK17. At the moment I'm using app_timer to give me a repeated trigger every 20 ms, and this seems to be working well based on scoping a GPIO pin.
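For context, this is roughly how the repeated timer is set up (a sketch only; names are simplified and error checking is omitted):

#include "app_timer.h"

APP_TIMER_DEF(m_repeat_timer_id);

// Fires every 20 ms once the timer is started.
static void repeat_timeout_handler(void * p_context)
{
    // toggle a pin here for scoping, etc.
}

// During init:
app_timer_init();
app_timer_create(&m_repeat_timer_id, APP_TIMER_MODE_REPEATED, repeat_timeout_handler);
app_timer_start(m_repeat_timer_id, APP_TIMER_TICKS(20), NULL);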
Occasionally, I require a 10 ms "delay" or "timer". Instead of formally creating another timer, however, I was planning to just get the current tick count with
app_timer_cnt_get()
and then add the number of ticks for 10 ms with
APP_TIMER_TICKS(10)
After this, I basically enter a dumb while loop that only exits once the count returned by app_timer_cnt_get() passes the previously calculated value. I thought this should be simple enough... however, I'm finding that when I do this, I end up with an additional 1 ms. So instead of 10 ms, I end up with about 11 ms. The logic seems okay to me, so I'm wondering if anyone has ideas about where that extra millisecond is coming from. Like I said initially, the repeated timer itself gives a fairly accurate result, but when counting ticks this way I'm not getting quite what I expected.
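Roughly, the busy-wait looks like this (a sketch of the pattern; variable names are just for illustration, and I'm not worrying about RTC counter wrap-around here):

uint32_t start  = app_timer_cnt_get();
uint32_t target = start + APP_TIMER_TICKS(10);   // tick count after ~10 ms

// Dumb busy-wait: spin until the RTC tick count reaches the target.
while (app_timer_cnt_get() < target)
{
    // do nothing
}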
If it's of any value, I have APP_TIMER_CONFIG_RTC_FREQUENCY configured so the RTC ticks at 32768 Hz (i.e. the prescaler value is 0).
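For reference, the relevant sdk_config.h line (assuming the value is 0, which corresponds to 32768 Hz):

#define APP_TIMER_CONFIG_RTC_FREQUENCY 0   // 0 => 32768 Hz (no prescaling)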