I am trying to measure and characterize the expected battery lifetime of my design. I've noticed that as the effective connection interval goes up, so does the duration of the Rx current peak. At a 1.8sec connection interval, the Rx duration is over 600usec (sometimes up to 1100usec!), which is very power hungry. This is during an idle connection when no data should be exchanged. I am connected to a PCA10000 acting as the central. Is this expected behaviour with slow connection parameters?
Also, in my measurements it seems the chip goes to standby for about 200usec between advertising peaks (after Rx post-processing, before the next radio start). This contradicts the profile shown in the S110 documentation. Can anyone comment? Thanks
After digging further into this, it appears that there is a correlation between the connection parameters and the Rx window, which makes sense to account for crystal variations between the two devices. I would just like to verify that a +700usec Rx window is expected with a 1.98sec connection interval. Thanks
If you are using the Master Emulator, please check the xtal accuracy; the master's xtal accuracy may be 250 ppm. Both the master's and the slave's accuracies contribute towards the window widening.
It is the inaccuracy of both the master's (central's) and slave's (peripheral's) low frequency clocks that determines the size of the RX window before a BLE event. BLE is a synchronous protocol, meaning that the master and slave both know when connection events (keep-alive events) occur. The master uses its low frequency clock to know when to transmit, and the slave uses its low frequency clock to know when to listen for the master's packet transmission. However, neither clock is perfectly accurate, so the slave must account for the drift of both the master and slave clocks. The slave therefore turns on its receiver some time before the master is expected to transmit; this margin is called the receive window (or window widening) of the slave. The longer the connection interval, the longer the slave has to keep the receiver on before receiving a packet.
What matters is the accuracy of the 32kHz clock source that you are using on the peripheral. You select the 32kHz clock (low frequency clock, lfclk) when you initialize the softdevice with a call to the SOFTDEVICE_HANDLER_INIT() function. The internal 32kHz RC oscillator has an accuracy of 250ppm, while crystals can be as accurate as 20ppm. You will therefore see lower average current consumption, on the order of 1uA-2uA, when using the crystal instead of the RC, as a result of the smaller RX window before each BLE event.
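For reference, the two common lfclk choices at init time look roughly like this (a sketch based on the S110-era SDK enums; check nrf_sdm.h in your SDK version for the exact names available to you):

```c
/* External 32.768kHz crystal, 20ppm -- smallest Rx window */
SOFTDEVICE_HANDLER_INIT(NRF_CLOCK_LFCLKSRC_XTAL_20_PPM, false);

/* Internal RC oscillator, 250ppm with periodic calibration -- wider Rx window */
SOFTDEVICE_HANDLER_INIT(NRF_CLOCK_LFCLKSRC_RC_250_PPM_250MS_CALIBRATION, false);
```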
See also this thread for further details.
I would say that a 700us window widening is normal if you are using the internal 32kHz RC, or if the master side is using a 250ppm clock. Is that the case? It should be smaller if you are using a high-accuracy 32kHz crystal.
I am using the Dev Kit and initialize with the low frequency crystal: SOFTDEVICE_HANDLER_INIT(NRF_CLOCK_LFCLKSRC_XTAL_20_PPM, false);
I am using the PCA10000 dongle as the central. It has an external 32.768kHz crystal with an accuracy of +/-20ppm, and the Dev Kit board (PCA10004) likewise uses a +/-20ppm external low frequency crystal. With both sides at 20ppm, I believe the Rx window should be much smaller than 700usec, unless there is another factor contributing to this that I have not considered.