Hello, this might be a silly question, but I've seen lines like these in various Nordic examples and I don't know what the difference is:
#define MIN_CONN_INTERVAL MSEC_TO_UNITS(500, UNIT_1_25_MS) /**< Minimum acceptable connection interval (0.5 seconds). */
#define CONN_SUP_TIMEOUT MSEC_TO_UNITS(4000, UNIT_10_MS) /**< Connection supervisory timeout (4 seconds). */
#define NON_CONNECTABLE_ADV_INTERVAL MSEC_TO_UNITS(100, UNIT_0_625_MS) /**< The advertising interval for non-connectable advertisement (100 ms). */
In all of the examples above, the first parameter to the macro is the number of milliseconds and the second parameter is the resolution. By changing the resolution, we change the number of ticks. But what is the difference? Which part of the software or hardware uses that number of ticks? So my question comes down to: what is the difference between the statements below, which one is better, and will MIN_CONN_INTERVAL be exactly 500 ms in all three cases?
#define MIN_CONN_INTERVAL1 MSEC_TO_UNITS(500, UNIT_0_625_MS)
#define MIN_CONN_INTERVAL2 MSEC_TO_UNITS(500, UNIT_1_25_MS)
#define MIN_CONN_INTERVAL3 MSEC_TO_UNITS(500, UNIT_10_MS)
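To make the question concrete, here is what I think the macro does, based on how app_util.h looks in my copy of the nRF5 SDK (the exact definitions below are my assumption, please correct me if my SDK version differs):

```c
/* Assumed to mirror app_util.h in the nRF5 SDK:
 * the resolution constants are the unit size expressed in microseconds. */
#define UNIT_0_625_MS  625
#define UNIT_1_25_MS   1250
#define UNIT_10_MS     10000

/* Converts a time in milliseconds into a number of ticks of the given unit. */
#define MSEC_TO_UNITS(TIME, RESOLUTION) (((TIME) * 1000) / (RESOLUTION))

/* For 500 ms this gives a different tick count for each resolution:
 *   MSEC_TO_UNITS(500, UNIT_0_625_MS) -> 500000 / 625   = 800 ticks
 *   MSEC_TO_UNITS(500, UNIT_1_25_MS)  -> 500000 / 1250  = 400 ticks
 *   MSEC_TO_UNITS(500, UNIT_10_MS)    -> 500000 / 10000 = 50  ticks
 * i.e. the same 500 ms expressed in three different units. */
```

So if my reading is right, the three macros just express 500 ms in different units, and my question is really about which unit the stack expects for each parameter and what happens if the wrong one is used.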