
Longer Connection Interval VS Slave Latency

My question concerns the peripheral device only. I have implemented an automatic connection interval negotiation algorithm that works as expected and negotiates the connection interval with the central device (usually an Android phone) based on a packet queue counter. As expected, it negotiates a much shorter interval when slave latency is in place than when slave latency is 0. I want to know which option is better in terms of power saving: a longer connection interval, or slave latency? Just to be clear, I don't care about power saving on the central device.

  • Hi,

    Both a longer connection interval and added slave latency will save you power. How much power slave latency saves depends on how often the peripheral actually has to send data to the central. We have a power profiler here, but unfortunately it doesn't support slave latency. If you are interested in power consumption, I would recommend buying a Power Profiler Kit and doing some real tests on your boards while running your application.
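    The arithmetic behind the trade-off can be sketched as follows. With slave latency N, an idle peripheral is allowed to skip up to N connection events, so it only has to wake the radio every (N + 1) intervals; when it does have data, though, it can still respond at the next connection event. A long interval with latency 0 gives similar idle power but worse responsiveness. This is a minimal illustration (the function name and the example values are mine, not from the thread):

    ```c
    #include <stdint.h>
    #include <stdio.h>

    /* Longest time the idle peripheral can sleep between mandatory radio
     * wake-ups: with slave latency N it may skip N connection events, so it
     * only has to listen every (N + 1) intervals. */
    static uint32_t effective_wakeup_ms(uint32_t conn_interval_ms,
                                        uint16_t slave_latency)
    {
        return conn_interval_ms * (uint32_t)(slave_latency + 1u);
    }

    int main(void)
    {
        /* Hypothetical numbers: a 50 ms interval with slave latency 4 lets an
         * idle peripheral sleep as long as a 250 ms interval with latency 0,
         * yet it can still transmit within 50 ms when data is queued. */
        printf("latency option:  %u ms\n", effective_wakeup_ms(50, 4));
        printf("interval option: %u ms\n", effective_wakeup_ms(250, 0));
        return 0;
    }
    ```

    So for comparable idle power, slave latency keeps the worst-case transmit latency at one (short) connection interval, which is why a negotiation algorithm ends up choosing a much shorter interval when slave latency is non-zero. Actual current draw still has to be measured, as the reply suggests.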
