Hi,
- We are using the S110 SoftDevice with SDK 7.2.0 in our application
- Our application needs quite accurate timing
- We set out to determine the accuracy of the TickTimer provided by the SoftDevice (S110 in our case)
- Our hardware setup is shown in the attached file "nRF51TimerTickAccuracyTestSetup.pdf"
- What you see is a 1MHz oscillator (accurate to a few PPM)
- it is connected to a 12-stage counter (hardware, so NO time lost due to freq. division)
- I take the 12th stage output (which is 1MHz / 2^12 => 244 Hz) and connect it to P0.0 on the nRF
- The code (shown in file "nRF51TimerTickAccuracyTest.pdf") does the following:
- Set up GPIOTE for my input pin (P0.0)
  - Init Timer2 as a counter in 16-bit mode
  - Configure PPI channel 0 so its event endpoint (NRF_PPI->CH[0].EEP) is the GPIOTE event from P0.0
  - and its task endpoint (NRF_PPI->CH[0].TEP) is Timer2's COUNT task, so each edge counts Timer2 up
- All this works very nicely, thanks to your previous help!!
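For reference, the steps above can be sketched at register level roughly like this (a minimal sketch only: the pin number macro is mine, and my actual code may differ in init order and in using SDK driver calls instead of raw registers):

```c
#include "nrf.h"

#define INPUT_PIN 0  /* P0.0, the 244 Hz divider output (assumption: adjust to your board) */

void counter_setup(void)
{
    /* GPIOTE channel 0: generate an event on each rising edge of P0.0 */
    NRF_GPIOTE->CONFIG[0] =
          (GPIOTE_CONFIG_MODE_Event       << GPIOTE_CONFIG_MODE_Pos)
        | (INPUT_PIN                      << GPIOTE_CONFIG_PSEL_Pos)
        | (GPIOTE_CONFIG_POLARITY_LoToHi  << GPIOTE_CONFIG_POLARITY_Pos);

    /* Timer2 as a 16-bit counter: each COUNT task increments it by one */
    NRF_TIMER2->MODE        = TIMER_MODE_MODE_Counter << TIMER_MODE_MODE_Pos;
    NRF_TIMER2->BITMODE     = TIMER_BITMODE_BITMODE_16Bit << TIMER_BITMODE_BITMODE_Pos;
    NRF_TIMER2->TASKS_CLEAR = 1;

    /* PPI channel 0: GPIOTE IN[0] event triggers Timer2's COUNT task */
    NRF_PPI->CH[0].EEP = (uint32_t)&NRF_GPIOTE->EVENTS_IN[0];
    NRF_PPI->CH[0].TEP = (uint32_t)&NRF_TIMER2->TASKS_COUNT;
    NRF_PPI->CHENSET   = PPI_CHENSET_CH0_Set << PPI_CHENSET_CH0_Pos;

    NRF_TIMER2->TASKS_START = 1;
    /* To inspect the count, trigger TASKS_CAPTURE[0] and read CC[0] */
}
```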
- I run the nRF without ANY BLE activity at all
- This should mean that the SD is quite undisturbed by asynchronous events like BLE traffic, etc.
- I expect to see CC[0] get a value close to 14,648:
  - every second gives 1,000,000 / 2^12 ≈ 244.14 counts
  - 244.14 counts/second * 60 seconds ≈ 14,648 counts / minute
- HOWEVER, what I actually see in the CC[0] register (using the Keil debugger) is 14,751 counts
  - That is a difference of about 103 counts, i.e. roughly 420,000 cycles of the 1 MHz source clock
  - in other words, 0.42 seconds gained every minute!
- Adding BLE doesn't change this number at all
- My questions are:
  - What is causing this 420 ms per minute gain?
- BTW, this is constant across many boards...
- Is there a way to fine-tune the timer such that it is more accurate than this?
Files attached
Thanks for your help!