
NRF52 Timing Accuracy

I'm developing an accurate, low-drift millisecond-resolution timer on the NRF52. I'm limited to the resources on the NRF52 itself. Details of the system:

  • Initialise TIMER0 to be clocked from the main crystal (16MHz 10 ppm, using the PCA10040 to test).
  • Configure TIMER0 to generate an interrupt every 1ms, using compare match 0 (see the configuration sketch after this list).
  • Every 1000ms, a pin on the NRF52 is toggled (NRF_PPS). [EDIT: this is only to validate the timer accuracy against the GPS 1PPS]
  • The TIMER0 interrupt is set as APP_IRQ_PRIORITY_HIGH.
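
Roughly, the setup looks like this (a simplified sketch using nRF5 SDK/CMSIS names and TIMER1, to match the NRF_TIMER1->CC[0] and TIMER1_IRQHandler references further down; it is not the exact code in the linked test program):

    #include "nrf.h"
    #include "app_util_platform.h"      /* APP_IRQ_PRIORITY_HIGH */

    void timer1_1ms_init(void)
    {
        /* Make sure the timer really runs from the external 16 MHz crystal (HFXO),
         * not the internal RC oscillator, before measuring drift. */
        NRF_CLOCK->EVENTS_HFCLKSTARTED = 0;
        NRF_CLOCK->TASKS_HFCLKSTART    = 1;
        while (NRF_CLOCK->EVENTS_HFCLKSTARTED == 0) { }

        NRF_TIMER1->MODE      = TIMER_MODE_MODE_Timer;
        NRF_TIMER1->BITMODE   = TIMER_BITMODE_BITMODE_32Bit;
        NRF_TIMER1->PRESCALER = 4;       /* 16 MHz / 2^4 = 1 MHz, i.e. 1 us per tick */
        NRF_TIMER1->CC[0]     = 1000;    /* compare roughly every 1 ms               */

        /* Clear the counter in hardware on each compare so the period does not
         * depend on when (or whether) the CPU services the interrupt. */
        NRF_TIMER1->SHORTS   = TIMER_SHORTS_COMPARE0_CLEAR_Msk;
        NRF_TIMER1->INTENSET = TIMER_INTENSET_COMPARE0_Msk;

        NVIC_SetPriority(TIMER1_IRQn, APP_IRQ_PRIORITY_HIGH);
        NVIC_EnableIRQ(TIMER1_IRQn);

        NRF_TIMER1->TASKS_START = 1;
    }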

To benchmark the accuracy of the timer, I use a GPS 1PPS output (stated to be accurate to <40ns 95% of the time). The test program waits for the rising edge of the GPS_PPS and, once this is received, starts the millisecond timer. On an oscilloscope, one channel is triggered on the rising edge of the GPS_PPS and another channel displays the NRF_PPS. The drift of the millisecond timer is the difference between the NRF_PPS and the GPS_PPS edges.
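
The interrupt handler is then little more than a millisecond counter plus the pin toggle, along these lines (again a sketch; the pin number is a placeholder and is assumed to be configured as an output elsewhere):

    #include "nrf.h"
    #include "nrf_gpio.h"

    #define NRF_PPS_PIN 17                           /* placeholder: whichever pin NRF_PPS is wired to */

    static volatile uint32_t m_milliseconds = 0;

    void TIMER1_IRQHandler(void)
    {
        if (NRF_TIMER1->EVENTS_COMPARE[0])
        {
            NRF_TIMER1->EVENTS_COMPARE[0] = 0;       /* clear the event; the counter is cleared by the shortcut */

            if (++m_milliseconds % 1000 == 0)
            {
                nrf_gpio_pin_toggle(NRF_PPS_PIN);    /* this edge is compared against GPS_PPS on the scope */
            }
        }
    }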

I've found that if I set the compare match value to what it should be (NRF_TIMER1->CC[0] = 1000 - 1), the drift is 750.0 ms over 4 minutes. Meanwhile, if I tweak the match value, the best I can get the drift down to is 133.4 ms over 4 minutes (NRF_TIMER1->CC[0] = 1000 - 1 - 2). This is drifting well outside the spec of the onboard crystal: a 10 ppm crystal should only drift 2.4 ms over 4 minutes!
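
Put differently, the 750 ms figure works out to about three extra 1 us ticks per 1 ms period (CC[0] = 1000 implies a 1 MHz timer tick), which is at least consistent with the "- 1 - 2" tweak helping. A quick sanity check of the arithmetic:

    #include <stdio.h>

    int main(void)
    {
        const double drift_ms = 750.0;              /* observed drift                 */
        const double test_s   = 240.0;              /* 4 minute test                  */
        const double periods  = test_s * 1000.0;    /* number of 1 ms compare periods */

        printf("error per period: %.3f us\n", drift_ms * 1000.0 / periods);          /* ~3.125 us */
        printf("equivalent error: %.0f ppm\n", (drift_ms / 1000.0) / test_s * 1e6);  /* ~3125 ppm */
        return 0;
    }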

Questions:

  1. Is generating interrupts on the compare match the best way to keep a millisecond-resolution timer value?
  2. Why does the timer drift so much when I set the compare match value to what it should be?
  3. Why is the timer drifting well outside of the spec of the 16MHz crystal on the PCA10040? The equivalent error of my millisecond timer before tweaking is 3125 ppm! After tweaking it is still 560 ppm (better, but still unacceptably high).

My application is also going to use BLE (advertising, central and peripheral connections at the same time!). From what I've read, the worst-case t_radio (Table 25) is 5.5ms in my use-case. Does this mean that I'll miss some 1ms timer compare interrupts? Or am I reading the s132 v4.0.2 SD specification incorrectly? What is the longest time for which interrupts at the APP_IRQ_PRIORITY_HIGH level can go unserviced while using BLE?
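
Not part of the test above, but if missed interrupts turn out to be the issue, the 1 s NRF_PPS edge could be generated entirely in hardware with PPI and GPIOTE, so no CPU involvement (and hence no interrupt latency) sits in the measurement path. A sketch, with placeholder pin/timer/channel choices (the SoftDevice reserves some PPI channels, so the free ones would need to be checked):

    #include "nrf.h"

    #define NRF_PPS_PIN 17   /* placeholder output pin */

    void hw_pps_init(void)
    {
        /* GPIOTE channel 0 as a task that toggles NRF_PPS_PIN. */
        NRF_GPIOTE->CONFIG[0] =
            (GPIOTE_CONFIG_MODE_Task       << GPIOTE_CONFIG_MODE_Pos)     |
            (NRF_PPS_PIN                   << GPIOTE_CONFIG_PSEL_Pos)     |
            (GPIOTE_CONFIG_POLARITY_Toggle << GPIOTE_CONFIG_POLARITY_Pos) |
            (GPIOTE_CONFIG_OUTINIT_Low     << GPIOTE_CONFIG_OUTINIT_Pos);

        /* A spare timer (TIMER2 here) producing one compare event per second. */
        NRF_TIMER2->MODE      = TIMER_MODE_MODE_Timer;
        NRF_TIMER2->BITMODE   = TIMER_BITMODE_BITMODE_32Bit;
        NRF_TIMER2->PRESCALER = 4;                    /* 1 MHz tick */
        NRF_TIMER2->CC[0]     = 1000000;              /* 1 s        */
        NRF_TIMER2->SHORTS    = TIMER_SHORTS_COMPARE0_CLEAR_Msk;

        /* PPI: compare event -> GPIOTE toggle task, no CPU in the path. */
        NRF_PPI->CH[0].EEP = (uint32_t)&NRF_TIMER2->EVENTS_COMPARE[0];
        NRF_PPI->CH[0].TEP = (uint32_t)&NRF_GPIOTE->TASKS_OUT[0];
        NRF_PPI->CHENSET   = PPI_CHENSET_CH0_Msk;

        NRF_TIMER2->TASKS_START = 1;
    }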

Notes:

  • The port toggling in the TIMER1_IRQHandler is not likely to influence the results. I've measured the ISR to take 978ns (excluding the interrupt context switching...).
  • I could change to clocking the timer from the 32kHz crystal (a rough sketch of that option follows these notes), but I would like to figure out why the current timer clocked from the XTAL is drifting so much.
  • I'm using NRF52 modules which have an onboard XTAL. I can't replace this with a TCXO, but the application does not require accuracy better than 10 ppm.
  • I'm using the s132_nrf52_4.0.2 SoftDevice. In the above test the SoftDevice is not enabled, but it will need to be when my application is doing BLE.
  • I've done something similar with my STM32-based projects. With an ordinary 10ppm XTAL, I've observed a drift of 1ms over 4 minutes, which equates to 4.2 ppm.
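
For completeness, the 32kHz option mentioned above would look roughly like this with RTC2 (a sketch; RTC0 is owned by the SoftDevice and RTC1 by app_timer in the SDK, and 32768 Hz does not divide evenly into 1 ms, so the tick below is ~1.007 ms and true milliseconds would have to be derived from the tick count):

    #include "nrf.h"
    #include "app_util_platform.h"      /* APP_IRQ_PRIORITY_HIGH */

    static volatile uint32_t m_rtc_ticks = 0;

    void rtc2_init(void)
    {
        /* Run the LFCLK from the 32.768 kHz crystal (with the SoftDevice enabled,
         * the LF clock source is instead set when the SoftDevice is enabled). */
        NRF_CLOCK->LFCLKSRC            = CLOCK_LFCLKSRC_SRC_Xtal << CLOCK_LFCLKSRC_SRC_Pos;
        NRF_CLOCK->EVENTS_LFCLKSTARTED = 0;
        NRF_CLOCK->TASKS_LFCLKSTART    = 1;
        while (NRF_CLOCK->EVENTS_LFCLKSTARTED == 0) { }

        /* 32768 / (32 + 1) = 993 Hz, i.e. one TICK event every ~1.007 ms. */
        NRF_RTC2->PRESCALER = 32;
        NRF_RTC2->INTENSET  = RTC_INTENSET_TICK_Msk;

        NVIC_SetPriority(RTC2_IRQn, APP_IRQ_PRIORITY_HIGH);
        NVIC_EnableIRQ(RTC2_IRQn);

        NRF_RTC2->TASKS_START = 1;
    }

    void RTC2_IRQHandler(void)
    {
        if (NRF_RTC2->EVENTS_TICK)
        {
            NRF_RTC2->EVENTS_TICK = 0;
            m_rtc_ticks++;               /* ~1.007 ms per tick, not exactly 1 ms */
        }
    }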

Test program: www.dropbox.com/.../nrf_timer_accuracy_test.c
