
"Best method" for avoiding overflow of app_timer

I will use the functions in app_timer to set a timer that fires every 3 seconds. I want this to occur "forever" (i.e., until a restart of the nRF51).

To avoid a timer overflow, is it better practice to set the PRESCALER to 0 and clear the timer after each interrupt, or to set the PRESCALER to a higher value, check within the interrupt for a near overflow, and clear the timer at that point? If one is better than the other, why is that so?

(thank you)

  • With prescaler 0, app_timer gets the best counter resolution and can count up to 512 seconds before it overflows. So if your application can keep PRESCALER at 0 and clear the timer with the hardware SHORT when the counter reaches the value in CC, that is the best solution: the SHORT has no dependency on interrupt priority, the softdevice, or anything else, so the clear is guaranteed to happen with a predictable maximum latency of one tick. The second-best option is to clear the timer when the CC interrupt fires, but that depends on interrupt priority and softdevice state, so the latency there is unpredictable.

    EDIT

    [deleted content]

  • Thank you. Re: tolerance of error when selecting PRESCALER, I am not sure I understand correctly. Another post notes the accuracy of the RC oscillator is 250 ppm. Is this correct: when the PRESCALER is 0, the time elapsed before overflow is 512 s; given 250 µs of drift per second, the clock will drift 512 × 0.00025 = 0.128 s. When the PRESCALER is 4095, the time elapsed before overflow is 2097152 s, so the clock will drift 2097152 s × 0.00025 ≈ 524 s, or ~8.73 minutes? If that is correct, how do I correct for this drift? Also, if this is correct, I am confused about the PRESCALER being the culprit for the error tolerance, rather than just the tick/oscillator.

  • The JITTER over a fixed period of time (say, 1 second) stays the same with any prescaler. When I said "the higher the PRESCALER, the higher its JITTER per tick", it is the per-tick JITTER that varies, since the tick period has obviously changed. I can see how my explanation could lead to confusion; I should simply have said that a lower prescaler means more sampling. Would you prefer that I edit my text?

  • I clearly did not understand; I was looking at RC oscillator accuracy, while you were discussing the oscillator not being in sync with PCLK16M (as noted in section 19.1.8 of the reference manual). I apologize for my denseness, but the reference manual attributes the variability in per-tick JITTER to "...as long as it takes for the peripheral to clock a falling edge and rising of the LFCLK." That reads to me as if the PRESCALER is not involved. I can see each JITTER varying (and, on top of that, the amount of jitter per tick differing depending on whether it is a task or an event)...however, isn't the PRESCALER just lumping ticks together?

  • Yes, it is. I was mixing things up. Sorry for the confusion.
