
Best timing accuracy between iOS device and peripheral device

Hi, I'd like to know how to estimate the best "timing accuracy" I could expect to achieve between an iOS device and a peripheral device. The objective is to make a specific thing happen at the same time on both devices.

Using BLE, one approach to achieve this would be to let the iOS device control the timing by writing a characteristic (without response). That is, the iOS device initiates the write at some specific time, and the peripheral device initiates its response as soon as it sees the written characteristic.

With a connection interval of 20 ms, and ignoring other details, I think we'd get an average timing error of 10 ms and a worst-case error of 20 ms.

I'd like to achieve better timing than this, any ideas? /Christian

Note: I've read the question/answer for How do I calculate throughput for a BLE link - it's very instructive, but I think not relevant here.

  • Hi,

    You will inevitably experience multiple delays from the initial call from the tx side application to when the message is received at the rx side application. You may also experience large differences in the delay (jitter).

    You will have roughly the following sources for delay:

    1. Preparing the packet
    2. Waiting for transmission
    3. Transmission time
    4. Propagation delay
    5. Reception operations
    6. Propagating message to application

    Points 1, 5 and 6 may take several ms on a smartphone, depending on system state. It varies from system to system, and I do not have the numbers for this. It would probably be best to test it yourself on the smartphone you are using; if you do, it would be great if you could share your findings. Point 2 is on average half the connection interval. Point 3 can be calculated from the BLE transmission rate and packet size. Point 4 is negligible due to the short range of BLE (the signal propagates 10 meters in ~33 ns).
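To illustrate point 3, here is a rough sketch of on-air transmission time at the 1 Mbps BLE 4.x PHY. The overhead byte counts are assumptions based on unencrypted link-layer framing (preamble, access address, PDU header, CRC) plus L2CAP and ATT headers for a write without response; other PHYs or framing will give different numbers.

```python
# Rough on-air time for a BLE 4.x data packet at the 1 Mbps PHY.
# Overhead counts are assumptions for an unencrypted ATT Write Command.

PHY_RATE_BPS = 1_000_000  # 1 Mbps (BLE 4.x uncoded PHY)

def air_time_us(att_value_len: int) -> float:
    """Approximate on-air time in microseconds for one data packet."""
    ll_overhead = 1 + 4 + 2 + 3     # preamble + access address + PDU header + CRC
    l2cap_att_overhead = 4 + 3      # L2CAP header + ATT Write Command header
    total_bytes = ll_overhead + l2cap_att_overhead + att_value_len
    return total_bytes * 8 / PHY_RATE_BPS * 1e6

# A 20-byte characteristic value takes roughly 0.3 ms on air:
print(air_time_us(20))  # -> 296.0
```

So the transmission time itself is a small fraction of the connection-interval wait.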

    The initial suggestion was to let the smartphone write a characteristic on the nRF device. Using 1.5 ms as a rough estimate for the sum of transmission time and nRF side delays, and 10 ms average for waiting for transmission (20 ms connection interval), one would expect a delay of x + 11.5 ms, where x is the smartphone side delay. The jitter would be substantial (± 10 ms from connection interval, in addition to smartphone and nRF stack jitter.)
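The arithmetic above can be sketched as follows; the 1.5 ms figure and the unknown smartphone-side delay x are the same rough estimates used in the paragraph.

```python
# Back-of-envelope delay for the "write without response" approach,
# using the rough estimates above. The smartphone-side delay x is unknown.

CONN_INTERVAL_MS = 20.0
AVG_WAIT_MS = CONN_INTERVAL_MS / 2   # point 2: average wait for a connection event
TX_AND_NRF_MS = 1.5                  # rough sum of points 3, 5 and 6

def expected_delay_ms(x_smartphone_ms: float) -> float:
    """Average end-to-end delay; the connection interval adds roughly +/- 10 ms of jitter."""
    return x_smartphone_ms + AVG_WAIT_MS + TX_AND_NRF_MS

print(expected_delay_ms(0.0))  # -> 11.5, i.e. x + 11.5 ms
```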

    If the smartphone API signals the application when the BLE packet is sent, you can do better. Instead of points 1-3 adding to the delay and jitter, you will only get the delay of propagating the "packet sent" signal to the smartphone app. This happens concurrently with reception-related operations and propagation to the application on the nRF side. In this scenario, the time used for notifying the application on the nRF side (~1 ms range) is probably shorter than that on the smartphone side, meaning the nRF is triggered a few ms before the application on the smartphone. In any case, you have eliminated the connection interval related delay and jitter.

    Depending on your needs and on the application, you may have a look at algorithms for clock synchronisation similar to the one used for Network Time Protocol (NTP). The basic idea is packet exchanges back and forth between devices A and B, that are timestamped using the RX side clock on RX and the TX side clock on TX. The delay is assumed to behave the same in both directions, and an estimate is calculated for the time difference between the two clocks. When the clocks are synchronised, the devices can perform simultaneous actions. The synchronisation process is kept running (using a longer interval) to keep the clocks from drifting out of sync. The accuracy will often be high compared to network delay and jitter.
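The NTP-style offset calculation described above can be sketched as follows. Here t1 is when A sends the request (A's clock), t2 when B receives it (B's clock), t3 when B replies (B's clock), and t4 when A receives the reply (A's clock); the key assumption is that the one-way delay is the same in both directions.

```python
# Minimal sketch of NTP-style clock offset estimation between devices A and B.
# Assumes a symmetric one-way delay; all function names here are illustrative.

def ntp_offset_and_delay(t1, t2, t3, t4):
    """Return (offset of B's clock relative to A's, round-trip network delay)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2   # B's clock minus A's clock
    delay = (t4 - t1) - (t3 - t2)          # round trip minus B's turnaround time
    return offset, delay

# Example (ms): B's clock runs 5 ms ahead, one-way delay is 12 ms,
# B takes 2 ms to turn the request around:
offset, delay = ntp_offset_and_delay(100, 117, 119, 126)
print(offset, delay)  # -> 5.0 24
```

Averaging this estimate over repeated exchanges, and rerunning it periodically, is what keeps the two clocks from drifting apart.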

    For other options, I suggest that you have a look at these other related questions:

    Regards, Terje
