I'm trying to use slave latency to reduce the current consumption of my device.
The connection parameters are defined in `main.c`:
```c
#define MIN_CONN_INTERVAL MSEC_TO_UNITS(15, UNIT_1_25_MS)
#define MAX_CONN_INTERVAL MSEC_TO_UNITS(30, UNIT_1_25_MS)
#define SLAVE_LATENCY     4
#define CONN_SUP_TIMEOUT  MSEC_TO_UNITS(4000, UNIT_10_MS)
```
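These are applied as the peripheral's preferred connection parameters roughly like this; the sketch follows the SDK 11 template's `gap_params_init()` and `sd_ble_gap_ppcp_set()`, so the surrounding code may differ slightly from my actual project:

```c
#include <string.h>
#include "ble_gap.h"
#include "app_error.h"

// Sketch: register the defines above as the peripheral's preferred
// connection parameters (based on the SDK 11 template's gap_params_init()).
static void gap_params_init(void)
{
    uint32_t              err_code;
    ble_gap_conn_params_t gap_conn_params;

    memset(&gap_conn_params, 0, sizeof(gap_conn_params));

    gap_conn_params.min_conn_interval = MIN_CONN_INTERVAL;
    gap_conn_params.max_conn_interval = MAX_CONN_INTERVAL;
    gap_conn_params.slave_latency     = SLAVE_LATENCY;
    gap_conn_params.conn_sup_timeout  = CONN_SUP_TIMEOUT;

    // This only writes the *preferred* parameters (PPCP); the master decides
    // which parameters are actually used for the connection.
    err_code = sd_ble_gap_ppcp_set(&gap_conn_params);
    APP_ERROR_CHECK(err_code);
}
```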
The device I'm connecting to is running iOS, and I'm well within its requirements (I check my numbers against these below the list):
- Slave Latency ≤ 30
- 2 seconds ≤ connSupervisionTimeout ≤ 6 seconds
- Interval Min modulo 15 ms == 0
- Interval Min ≥ 15 ms
- Interval Min + 15 ms ≤ Interval Max
- Interval Max * (Slave Latency + 1) ≤ 2 seconds
- Interval Max * (Slave Latency + 1) * 3 < connSupervisionTimeout
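Plugging my parameters in:

- Slave Latency = 4 ≤ 30 ✓
- connSupervisionTimeout = 4 s, within 2–6 s ✓
- Interval Min = 15 ms, so 15 mod 15 = 0 ✓ and 15 ms ≥ 15 ms ✓
- Interval Min + 15 ms = 30 ms ≤ Interval Max = 30 ms ✓
- Interval Max * (Slave Latency + 1) = 30 ms * 5 = 150 ms ≤ 2 s ✓
- 150 ms * 3 = 450 ms < 4000 ms = connSupervisionTimeout ✓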
To figure out the actual connection parameters, I've put breakpoints after every call to `sd_ble_gap_conn_param_update()` in my project, but none of them are ever hit, even after the module connects to the master.
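For reference, the other way I'd expect to read out the parameters the master actually applies is from the GAP events rather than from the update calls. A minimal sketch, assuming the SDK 11 template's `on_ble_evt()` dispatcher and `NRF_LOG_PRINTF()` logging:

```c
#include "ble.h"
#include "nrf_log.h"

// Sketch: log the connection parameters the master actually applies,
// from the application's BLE event handler (on_ble_evt in SDK 11 templates).
static void on_ble_evt(ble_evt_t * p_ble_evt)
{
    switch (p_ble_evt->header.evt_id)
    {
        case BLE_GAP_EVT_CONNECTED:
        {
            ble_gap_conn_params_t const * p =
                &p_ble_evt->evt.gap_evt.params.connected.conn_params;
            NRF_LOG_PRINTF("connected: interval %d*1.25ms latency %d timeout %d*10ms\r\n",
                           p->max_conn_interval, p->slave_latency, p->conn_sup_timeout);
        } break;

        case BLE_GAP_EVT_CONN_PARAM_UPDATE:
        {
            ble_gap_conn_params_t const * p =
                &p_ble_evt->evt.gap_evt.params.conn_param_update.conn_params;
            NRF_LOG_PRINTF("updated: interval %d*1.25ms latency %d timeout %d*10ms\r\n",
                           p->max_conn_interval, p->slave_latency, p->conn_sup_timeout);
        } break;

        default:
            break;
    }
}
```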
The problem is that I'm not seeing any effect of slave latency on the current consumption profile (see image below): there are regular pulses every 30 ms (the connection interval). What I expected to see is a pulse only about every (4 + 1) * 30 ms = 150 ms, since there's otherwise no data to transmit and slave latency should let the peripheral skip up to 4 connection events.
Why am I not seeing reduced current consumption from slave latency?
I'm using the nRF5 SDK 11.