My question is about the peripheral device only. I have implemented an automatic connection-interval negotiation algorithm that works as expected: it negotiates the connection interval with the central device (usually an Android phone) based on a packet-queue counter. As expected, it negotiates a much shorter connection interval when slave latency is in place than when slave latency is "0".

Which option is better for power saving on the peripheral: a longer connection interval, or a shorter interval combined with slave latency? Just to be clear, I don't care about power saving on the central device.
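For context, the decision my algorithm makes looks roughly like the minimal sketch below. Here `ble_request_conn_params()` is just a stand-in for whatever connection-parameter-update call your stack exposes (e.g. an L2CAP Connection Parameter Update Request), and all thresholds, intervals, and latency values are illustrative, not my actual tuning:

```c
/*
 * Minimal sketch of queue-driven connection parameter negotiation.
 * ble_request_conn_params() is a hypothetical placeholder for the
 * stack's parameter-update request; thresholds/values are examples.
 */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint16_t interval_min;  /* connection interval, 1.25 ms units  */
    uint16_t interval_max;  /* connection interval, 1.25 ms units  */
    uint16_t slave_latency; /* events the peripheral may skip      */
    uint16_t timeout;       /* supervision timeout, 10 ms units    */
} conn_params_t;

/* Placeholder: forward the request to your BLE stack here. */
static void ble_request_conn_params(const conn_params_t *p)
{
    printf("request: interval %u-%u, latency %u, timeout %u\n",
           p->interval_min, p->interval_max,
           p->slave_latency, p->timeout);
}

/* Pick parameters from the pending-packet count. Both idle branches
 * aim at roughly the same effective wake-up rate: either a long
 * interval with latency 0, or a short interval that slave latency
 * allows the peripheral to skip most of the time. */
static void negotiate_from_queue(uint32_t queued_packets, int use_latency)
{
    conn_params_t p = { .timeout = 400 }; /* 4 s supervision timeout */

    if (queued_packets > 8) {
        /* Busy: short interval, no event skipping. */
        p.interval_min = 12;   /* 15 ms */
        p.interval_max = 24;   /* 30 ms */
        p.slave_latency = 0;
    } else if (use_latency) {
        /* Idle, latency enabled: short interval, skip most events.
         * Effective idle period ~= interval * (latency + 1). */
        p.interval_min = 24;   /* 30 ms */
        p.interval_max = 40;   /* 50 ms */
        p.slave_latency = 9;   /* wake at most every ~300-500 ms */
    } else {
        /* Idle, latency 0: stretch the interval itself instead. */
        p.interval_min = 240;  /* 300 ms */
        p.interval_max = 400;  /* 500 ms */
        p.slave_latency = 0;
    }
    ble_request_conn_params(&p);
}

int main(void)
{
    negotiate_from_queue(2, 1); /* idle queue, latency enabled  */
    negotiate_from_queue(2, 0); /* idle queue, latency disabled */
    return 0;
}
```

In both idle branches the peripheral ends up waking at roughly the same rate, which is exactly why I'm unsure whether one is actually cheaper in practice than the other.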