This question is similar to this thread: devzone.nordicsemi.com/.../ However, that one was framed differently and didn't really clear things up for me.
Say my application requires a slave to send very-low-latency, potentially very infrequent messages.
I would want to:
Minimise Connection Interval: 7.5ms
Maximise Slave Latency: 499
So the slave would only "need" to be active every 3.75s (500 × 7.5ms).
The slave "must" receive from the master every 500th interval, but what if it doesn't? Am I right in thinking the link would not be lost until Supervision Timeout occurs? If so, and Supervision Timeout >> 500*Connection Interval (eg 32s) what is the effect of Slave Latency?