I'm using nRFgo Studio to try to design the connection parameters for an application, in particular using the Current Consumption tab to get estimates of power consumption for different settings. Playing with slave latency and connection interval, I would expect to get similar power consumption as long as the product of slave latency and connection interval is the same -- in other words, as long as the nRF8001 is able to delay connection events by the same amount of time. However, what I'm seeing is that while higher slave latency does reduce power consumption, higher connection intervals reduce it much more.
For example, a slave latency of 0 with a connection interval of 500 ms yields an average consumption of ~26 uA, while a slave latency of 10 with a connection interval of 50 ms (an effective event interval of (10 + 1) × 50 ms = 550 ms, nearly the same) yields an average consumption of ~105 uA.
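Here's the back-of-the-envelope model behind my expectation (a rough sketch only; the sleep current and per-event charge below are placeholder assumptions I made up, not nRF8001 datasheet figures):

```python
# Rough model of average current vs. connection parameters.
# Assumption: the slave sleeps between connection events and only
# wakes every (slave_latency + 1) connection intervals. The current
# figures below are illustrative placeholders, not nRF8001 values.

SLEEP_CURRENT_UA = 2.0   # assumed sleep current, in microamps
EVENT_CHARGE_UC = 12.0   # assumed charge per connection event, in microcoulombs

def avg_current_ua(conn_interval_ms: float, slave_latency: int) -> float:
    """Average current in uA if events fire every (latency + 1) intervals."""
    effective_interval_s = conn_interval_ms * (slave_latency + 1) / 1000.0
    return SLEEP_CURRENT_UA + EVENT_CHARGE_UC / effective_interval_s

# Both configurations have nearly the same effective event interval,
# so this model predicts nearly the same average current:
print(avg_current_ua(500, 0))   # 500 ms effective interval
print(avg_current_ua(50, 10))   # 550 ms effective interval
```

Under this model the two settings should be within about 10% of each other, which is why the ~4x difference nRFgo Studio reports surprises me.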
Am I misunderstanding how slave latency works? Or does nRFgo Studio account for slave latency in some other way?
Thanks!