Does anyone have experience of successful TICK_DV calibration on the nRF9E5? What order of accuracy should be achievable?
In my application I maintain a software RTC clocked by the LF TICK. This RTC is periodically SYNCed with a Host RTC, and its accuracy is compared against the Host RTC.
The nRF9E5 wakes from deep sleep every 30 sec and the TICK_DV calibration is performed each time before it returns to deep sleep.
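For reference, the RTC bookkeeping itself is nothing exotic. Conceptually it is just the following (a simplified sketch with illustrative names, not my exact code; the 600-tick wake period is an assumption that matches 600 x 50 ms = 30 s):

/* Simplified software RTC (names illustrative).
 * The wakeup timer is assumed to be programmed for WAKE_TICKS ticks of
 * nominally TICK_MS each; every wakeup advances the RTC by that amount,
 * and a SYNC from the Host RTC replaces the count.
 */
#define TICK_MS     50UL            /* nominal TICK period with TICK_DV = 199 */
#define WAKE_TICKS  600UL           /* 600 x 50 ms = 30 s between wakeups     */

static unsigned long rtc_ms;        /* ms elapsed since the last Host SYNC    */
static unsigned long sync_base_s;   /* Host RTC seconds at the last SYNC      */

void rtc_on_wakeup(void)            /* called once per 30 s wakeup            */
{
    rtc_ms += WAKE_TICKS * TICK_MS;
}

void rtc_sync(unsigned long host_seconds)   /* called on each Host SYNC       */
{
    sync_base_s = host_seconds;
    rtc_ms = 0;
}

unsigned long rtc_seconds(void)     /* compared against the Host RTC          */
{
    return sync_base_s + rtc_ms / 1000UL;
}

The gain/loss figures further down come from that comparison against the Host RTC.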
I have followed the recommendations in manual section 19.2 Tick calibration, i.e. I have written code to measure the 16-bit difference between consecutive captures in the SFR registers {RCAP2H, RCAP2L} and calculate TICK_DV from this.
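In case it helps, the core of that calculation is essentially this (a simplified sketch rather than my exact code: the Timer 2 capture setup is done per section 19.2 and is not shown, the 4 MHz timer clock assumes a 16 MHz crystal with Timer 2 clocked at XO_FREQ/4, and the write of the result into TICK_DV is omitted):

/* Keil C51 style; standard 8052 Timer 2 SFRs (or use your nRF9E5 SFR header). */
sfr  T2CON  = 0xC8;
sbit EXF2   = T2CON^6;          /* set on each capture event                  */
sfr  RCAP2L = 0xCA;             /* capture registers per manual section 19.2  */
sfr  RCAP2H = 0xCB;

#define TIMER2_HZ   4000000UL   /* ASSUMED: 16 MHz crystal, timer = XO/4      */
#define TICK_MS     50UL        /* desired TICK period                        */
/* Timer 2 counts per desired TICK: 4 MHz x 50 ms = 200000                    */
#define COUNTS_PER_TICK (TIMER2_HZ / 1000UL * TICK_MS)

static unsigned int read_capture(void)
{
    EXF2 = 0;
    while (!EXF2)               /* wait for the next capture event            */
        ;
    return ((unsigned int)RCAP2H << 8) | RCAP2L;
}

unsigned char calc_tick_dv(void)
{
    unsigned int c1, c2, period;
    unsigned long dv;

    /* Timer 2 is already free-running in capture mode (section 19.2).        */
    c1 = read_capture();
    c2 = read_capture();
    period = c2 - c1;           /* 16-bit subtraction handles wrap-around     */

    if (period == 0)
        return 199;             /* fall back to the nominal value             */

    /* TICK scales with (TICK_DV + 1), so
       TICK_DV = COUNTS_PER_TICK / period - 1, rounded to nearest.
       E.g. period = 1000 counts gives 200000 / 1000 - 1 = 199.               */
    dv = (COUNTS_PER_TICK + period / 2u) / period;
    return (unsigned char)(dv - 1u);
}

The returned value is then written into TICK_DV before the device goes back to deep sleep.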
The default TICK_DV is 3, giving a nominal TICK of 1 ms; instead I set TICK_DV to 199, giving a nominal TICK of 50 ms. I chose this because a ±1 error in 199 (resulting from rounding in the calculation) is far less severe than a ±1 error in 3.
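To put numbers on that: the TICK period scales with (TICK_DV + 1), which matches 3 -> 1 ms and 199 -> 50 ms, so a ±1 error shifts the tick by 1/(TICK_DV + 1): about ±25% at TICK_DV = 3 but only ±0.5% at TICK_DV = 199.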
However, to get anything approaching satisfactory RTC performance I have to apply an empirical adjustment to the calculated TICK_DV. I also report the calculated value in my RF packets: over the last few hours I have been seeing values in the range 169 to 174.
The best result I have achieved so far is a gain of 30 sec over 4 hours (roughly 0.2%); but when the same firmware runs on another batch of devices (built on a different PCB) they lose approximately 2 min 45 sec over the same 4 hour period (roughly 1.1%).
Clearly these results are disappointing.
Suggestions welcome.
Damien