
Power Consumption: best connection interval

Hi,

Let's assume that we know a packet is ready to be sent every 20ms. What is the best connection interval from a power-consumption point of view? Here is my understanding:

1- The connection interval should not be smaller than 20ms, since the radio would wake up at a faster rate than data is produced, and there would be some connection events in which no data is ready to be sent.

2- If set to 60ms, there will be 3 packets to send per connection event. This is more power-efficient than a 20ms connection interval (with one packet per event), right? At least the HFXO ramp-up and BLE start phase will be skipped for the last two packets in an event.
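The trade-off in points 1 and 2 can be put into a back-of-the-envelope model. All the charge figures below are illustrative assumptions, not measured values; only the structure of the calculation (a fixed per-event overhead plus a per-packet cost) reflects the reasoning above:

```python
# Rough charge budget for one second of operation.
# EVENT_OVERHEAD_UC and PACKET_COST_UC are assumed numbers for illustration.

SAMPLE_PERIOD_MS = 20     # a packet is ready every 20 ms

EVENT_OVERHEAD_UC = 2.0   # assumed charge (uC) per connection event:
                          # HFXO ramp-up + stack preparation/teardown
PACKET_COST_UC = 1.0      # assumed charge (uC) to transmit one packet

def charge_per_second(conn_interval_ms):
    """Total charge (uC) spent per second for a given connection interval."""
    events_per_second = 1000 / conn_interval_ms
    packets_per_event = conn_interval_ms / SAMPLE_PERIOD_MS
    # The total number of packets per second (50) is the same for every
    # interval choice; only the number of fixed per-event overheads changes.
    return events_per_second * (EVENT_OVERHEAD_UC
                                + packets_per_event * PACKET_COST_UC)

for interval in (20, 60, 120):
    print(f"{interval} ms interval: {charge_per_second(interval):.1f} uC/s")
```

With these assumed numbers, 60ms spends noticeably less charge than 20ms because the per-event overhead is paid a third as often, which is the effect point 2 describes.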

Vala

  • Hi Vala,

    In my opinion:

    • If this is your steady-state scenario (acquiring data at 20ms intervals for a long time, until the end of the universe or of your battery's lifetime), then yes, grouping data into single packets (by extending the PDU and ATT_MTU lengths) and into single connection intervals (transporting more than 20 bytes of data at once) will save you the power spent on stack preparation/teardown for each interval. If your peers support larger packets or more packets per interval, you can easily go beyond 60ms.

    • However, if your use case is "a period of time when the device acquires data and transports it to some base-station/peer, followed by a period of deep sleep", things change: you might want to transport the data as fast as possible to leverage the much lower consumption during POWER OFF or a similar power-saving mode.

    It seems you are closer to the first situation, but it's good to understand that the second one can happen; then, surprisingly, using the shortest advertising interval (which should cause a faster reaction from the scanner) and a (reasonably) short connection interval leads to the more power-efficient solution.
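    To make the first bullet concrete, here is a small sketch of how a larger ATT_MTU lets several samples ride in one notification. The 20-byte sample size and the MTU values are assumptions for illustration; the 3-byte header is the standard ATT Handle Value Notification overhead (1-byte opcode + 2-byte handle):

```python
# How many fixed-size samples fit into one ATT notification.
# SAMPLE_BYTES and the MTU values below are illustrative assumptions.

ATT_HEADER_BYTES = 3      # notification opcode (1) + attribute handle (2)
SAMPLE_BYTES = 20         # assumed size of one 20 ms sample

def samples_per_notification(att_mtu):
    """Whole samples that fit in a single ATT Handle Value Notification."""
    payload = att_mtu - ATT_HEADER_BYTES
    return payload // SAMPLE_BYTES

for mtu in (23, 64, 247):  # 23 = BLE default ATT_MTU; 247 is a common maximum
    n = samples_per_notification(mtu)
    print(f"ATT_MTU {mtu}: {n} sample(s) per notification")
```

    With the default ATT_MTU of 23 only one sample fits per notification, while a negotiated MTU of 247 packs a dozen, which is what makes connection intervals well beyond 60ms practical.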

    Cheers, Jan

  • Hi Jan,

    Thanks for your comment. If I understood correctly, the two cases you mentioned correspond to real-time versus offline applications. By offline I mean that the measurements are executed, the samples are stored in a buffer, and finally the buffer is sent to a peer. Yes, I agree that in this case a faster connection rate with the maximum allowable number of packets per connection event would lead to better power consumption.

    As you mentioned, my application is a real-time one, so it is closer to the first item you described.

  • Yes, more or less it's the (permanently) on-line vs. off-line ("collect and transfer later") problem. If I can add my humble 4-year experience with various BLE embedded devices: these theoretical debates about power consumption are not worth days of your time. If you mean it seriously, then you need to build a few PoC FW variants and measure them on your real target board. If you don't have a couple of weeks for that kind of optimization, then choose the one which looks best on the drawing board after a few hours of brainstorming, close your eyes and pray ;)
