
RTC system time inaccuracy

I have a 1-second app_timer running which increments a variable once per second, used as Unix time. We have a 32 kHz crystal, etc., which should result in good accuracy.
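
For reference, this is roughly the setup being described, as a minimal sketch rather than the actual code: a repeated 1 s app_timer whose handler increments a Unix-time counter. It assumes the nRF5 SDK app_timer and that `app_timer_init()`/`APP_TIMER_INIT()` has already been called; note that in older SDKs `APP_TIMER_TICKS()` also takes the prescaler as a second argument.

```c
#include <stdint.h>
#include "app_timer.h"
#include "app_error.h"

APP_TIMER_DEF(m_second_timer_id);

static volatile uint32_t m_unix_time;   /* seconds since epoch, set elsewhere at boot */

/* Called once per second by app_timer; drift in the timer shows up here directly. */
static void second_tick_handler(void * p_context)
{
    m_unix_time++;
}

void clock_timer_start(void)
{
    ret_code_t err = app_timer_create(&m_second_timer_id,
                                      APP_TIMER_MODE_REPEATED,
                                      second_tick_handler);
    APP_ERROR_CHECK(err);

    /* Older SDKs: APP_TIMER_TICKS(1000, APP_TIMER_PRESCALER) */
    err = app_timer_start(m_second_timer_id, APP_TIMER_TICKS(1000), NULL);
    APP_ERROR_CHECK(err);
}
```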

I have noticed drift of several seconds per day across several devices just sitting on a shelf. It's similar to what was reported in this question (though there is a lot to digest there): devzone.nordicsemi.com/.../

It seems the app_timer library can be responsible for this, as described in that thread. (I also have a 100 ms app_timer for finer-resolution timing of events.)

We were hoping for 1 minute per month accuracy. Under these conditions, and assuming the crystal itself is sufficiently accurate, is it possible for the methods described in that thread, for example something like:

http://pastie.org/10492786#2

to result in this sort of accuracy?
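
The linked pastie isn't reproduced here, but the general idea discussed in that thread is to derive elapsed time from the RTC tick counter that app_timer runs on, rather than counting handler callbacks, so that scheduling latency does not accumulate. A sketch of that idea, assuming a newer nRF5 SDK where `app_timer_cnt_get()` takes no arguments and app_timer runs at 32768 Hz (prescaler 0):

```c
#include <stdint.h>
#include "app_timer.h"

static uint32_t m_last_ticks;        /* RTC counter value at the last update */
static uint32_t m_tick_remainder;    /* sub-second ticks carried over        */
static uint32_t m_unix_time;         /* whole seconds since epoch            */

/* Call at least every few minutes (e.g. from the existing 1 s handler) so the
 * 24-bit counter cannot wrap more than once between calls (~512 s at 32768 Hz). */
void unix_time_update(void)
{
    uint32_t now   = app_timer_cnt_get();
    uint32_t delta = app_timer_cnt_diff_compute(now, m_last_ticks);  /* handles 24-bit wrap */

    m_last_ticks      = now;
    m_tick_remainder += delta;

    m_unix_time      += m_tick_remainder / 32768;   /* whole seconds elapsed    */
    m_tick_remainder  = m_tick_remainder % 32768;   /* keep fractional ticks    */
}
```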

  • Before overcomplicating the code you should understand why the timer isn't accurate. Are the intervals between ticks too long (or too short), or are you missing ticks now and again? Do you have the timer set up as a repeating timer, or are you rescheduling the next 100 ms timer after the previous one expires?

    You don't need to use app_timer at all if you just want something as simple as a clock. You don't say whether you're using an nRF51 or nRF52; the latter has an extra RTC you could use with a large prescaler to count slowly. Then all you have to do is deal with overflow and you have an accurate clock, as in the sketch below.
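
As a sketch of that suggestion (assuming an nRF52, the LFCLK already running on the 32.768 kHz crystal, as it is when a SoftDevice is present, and RTC2 otherwise unused): run RTC2 with the maximum prescaler so it ticks at 8 Hz, count overflows of the 24-bit counter (roughly every 24 days), and compute seconds on demand.

```c
#include <stdint.h>
#include "nrf.h"

static volatile uint32_t m_overflows;   /* number of 24-bit COUNTER wraps */

void RTC2_IRQHandler(void)
{
    if (NRF_RTC2->EVENTS_OVRFLW)
    {
        NRF_RTC2->EVENTS_OVRFLW = 0;
        m_overflows++;
    }
}

void clock_rtc_init(void)
{
    NRF_RTC2->PRESCALER = 4095;                      /* 32768 / 4096 = 8 ticks per second  */
    NRF_RTC2->INTENSET  = RTC_INTENSET_OVRFLW_Msk;   /* interrupt only on overflow          */
    NVIC_SetPriority(RTC2_IRQn, 7);                  /* a SoftDevice-safe app priority      */
    NVIC_EnableIRQ(RTC2_IRQn);
    NRF_RTC2->TASKS_START = 1;
}

/* Seconds since clock_rtc_init(); add a boot-time Unix offset for wall-clock time. */
uint32_t clock_seconds(void)
{
    uint32_t ovf, cnt;
    do {                                             /* re-read if an overflow slipped in   */
        ovf = m_overflows;
        cnt = NRF_RTC2->COUNTER;
    } while (ovf != m_overflows);
    return (uint32_t)((((uint64_t)ovf << 24) | cnt) / 8);
}
```

Because the time is computed from the free-running hardware counter rather than from software callbacks, its long-term accuracy is set by the crystal alone.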
