I have two nRF51822 devices, one running the S110 SoftDevice, the other the S120. The S110 device reads ADC values every 20 ms, and if they have changed since the last reading it sends a notification with the values to the S120 device. My connection interval is 7.5 ms, so this works fine (the packet is only 9 bytes). However, I also want to know roughly how "old" the ADC readings are when they are received on the S120 device. I used the ble_app_multilink example in the SDK as a basis for the project, and tried the following:
I modified it so that when the S120 enables notifications on the S110, it also starts RTC1 with a prescaler of 64. When the S110 receives the enable, it starts its own RTC1 with the same prescaler. The idea is that the real-time clocks on the two devices are now roughly in sync, disregarding the time it takes to send and process the packet; that difference should in any case only add a constant offset. Every time the S110 sends a message to the S120 it timestamps the message, and the S120 checks how late it is. Since the packets are timestamped and then queued, I expected the difference to be at most one connection interval, but the values seem to be completely random, anywhere from 2 ticks to about 35 ticks (~4 ms to ~70 ms). The S120 device uses ble_radio_notification to know exactly when to start the RTC, so I'm guessing that part should be fairly accurate.
So, I have two questions:
- Why do I see such large variation in the received packets?
- Is there a better way to know how "old" the received packet is?