
Bluetooth connection parameters

I am working with nRFx modules and have a few questions about updating connection parameters. I started my project from an SDK example and modified what I needed. However, I am not sure whether the parameters actually take effect when I change them. Currently I go into main.c and change the following parameters:

#define MIN_CONN_INTERVAL               MSEC_TO_UNITS(20, UNIT_1_25_MS)              /**< Minimum acceptable connection interval (20 ms). */
#define MAX_CONN_INTERVAL               MSEC_TO_UNITS(75, UNIT_1_25_MS)              /**< Maximum acceptable connection interval (75 ms). */
#define SLAVE_LATENCY                   0                                            /**< Slave latency (connection events the peripheral may skip). */
#define CONN_SUP_TIMEOUT                MSEC_TO_UNITS(4000, UNIT_10_MS)              /**< Connection supervision timeout (value was cut off in the post; 4 s is the SDK example default). */
#define FIRST_CONN_PARAMS_UPDATE_DELAY  APP_TIMER_TICKS(5000, APP_TIMER_PRESCALER)   /**< Time from connect to first conn. params update request (5 s). */
#define NEXT_CONN_PARAMS_UPDATE_DELAY   APP_TIMER_TICKS(30000, APP_TIMER_PRESCALER)  /**< Time between subsequent update requests (30 s). */
#define MAX_CONN_PARAMS_UPDATE_COUNT    3                                            /**< Number of update attempts before giving up. */
#define TX_POWER_LEVEL                  4                                            /**< TX power in dBm. */

If I change these parameters, specifically the first three (connection interval, slave latency, and connection supervision timeout), will they automatically be applied to the connection with the central? I am currently connecting to a Windows 10 device and trying to figure out whether the parameters are being updated, and if not, why. How would I check whether they are getting updated?
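
One way to check on the nRF side (a minimal sketch, assuming the SoftDevice event dispatch used by the ble_app_uart example; the printf logging is my addition, not from the example) is to log what the SoftDevice reports on connect and on every parameter update:

static void on_ble_evt(ble_evt_t * p_ble_evt)
{
    switch (p_ble_evt->header.evt_id)
    {
        case BLE_GAP_EVT_CONNECTED:
        {
            /* Parameters the connection actually started with. */
            ble_gap_conn_params_t const * p = &p_ble_evt->evt.gap_evt.params.connected.conn_params;
            printf("connected: interval=%u (1.25 ms units) latency=%u timeout=%u (10 ms units)\r\n",
                   p->max_conn_interval, p->slave_latency, p->conn_sup_timeout);
            break;
        }

        case BLE_GAP_EVT_CONN_PARAM_UPDATE:
        {
            /* Fired whenever the parameters change, e.g. after the
             * central answers a Connection Parameter Update Request. */
            ble_gap_conn_params_t const * p = &p_ble_evt->evt.gap_evt.params.conn_param_update.conn_params;
            printf("updated:   interval=%u latency=%u timeout=%u\r\n",
                   p->max_conn_interval, p->slave_latency, p->conn_sup_timeout);
            break;
        }

        default:
            break;
    }
}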

Parents
  • What project do you use? There are like a hundred main.c files in the SDK ;) Normally these constants are indeed used in the code when you recompile and load it onto the board, so they should take effect. The question is what you expect to happen ;) Normally there is no Link Layer debugging on systems like smartphones or Windows boxes, so the usual way is to get an RF analyzer/sniffer and see what is happening over the air.

Children
  • The project that was used can be found in the SDK under examples\ble_peripheral\ble_app_uart. So if I modify those parameters, do they get sent to the central and used when the connection is first made? What do you suggest as the best way to figure out whether the parameters I am setting are actually being used?

  • No, they are not used when the connection starts, because there is simply no way the Master (Central) could know them; they are not broadcast (there is no space for that). However, the Master can (and should) read them from the Peripheral Preferred Connection Parameters characteristic, and the Peripheral can also use the Connection Parameter Update procedure.

    I'm afraid the problem here is the perception, which leads you to the wrong questions. I guess it's safe to say that these parameters are used, but that doesn't mean you can automatically achieve what you want. Which brings us to the right question (from my point of view ;): what exactly do you want to achieve? Do you want to force the Master (Central) to use certain connection parameters? Do you want to increase bandwidth? Lower latency? Do you want to achieve that with every peer (another nRFx board, phones, PCs...)? Maybe you are fiddling with something that won't help you.
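
    In case it helps, a minimal sketch of that update procedure on the peripheral side, using the SoftDevice call sd_ble_gap_conn_param_update() (the function name request_preferred_params and the reuse of the macros from the question are my assumptions, not from this thread):

    static void request_preferred_params(uint16_t conn_handle)
    {
        ble_gap_conn_params_t params;
        params.min_conn_interval = MIN_CONN_INTERVAL;  /* 1.25 ms units */
        params.max_conn_interval = MAX_CONN_INTERVAL;
        params.slave_latency     = SLAVE_LATENCY;
        params.conn_sup_timeout  = CONN_SUP_TIMEOUT;   /* 10 ms units */

        /* As a peripheral, this sends an L2CAP Connection Parameter Update
         * Request; the central is free to accept or reject it. Watch for
         * BLE_GAP_EVT_CONN_PARAM_UPDATE to see what was actually granted. */
        uint32_t err_code = sd_ble_gap_conn_param_update(conn_handle, &params);
        APP_ERROR_CHECK(err_code);
    }

    Note that the ble_conn_params module in the SDK example already drives this call on a timer, which is where the two *_UPDATE_DELAY macros come in (see the sketch further down).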

  • Oh, I was thinking that those parameters were communicated to the central at first connection. Could you help me understand the Connection Parameter Update procedure? This is where I would use FIRST_CONN_PARAMS_UPDATE_DELAY and NEXT_CONN_PARAMS_UPDATE_DELAY, correct?

    I would like to try to force the master to accept the parameters so that I can experiment and see which parameters make me experience Bluetooth disconnects less often. As it stands, I have a central that communicates with two peripherals that tend to move around the room a bit. I transmit and expect data back every 300 ms or so. Bandwidth is not an issue, as I am not sending or receiving a lot of data. I was thinking that I could lower the MIN_CONN_INTERVAL and MAX_CONN_INTERVAL settings and increase CONN_SUP_TIMEOUT (so that I get fewer Bluetooth disconnects from the Windows Bluetooth stack).
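
    For reference, those two delay macros are consumed by the SDK's ble_conn_params module. A sketch of conn_params_init(), reconstructed from memory of the ble_app_uart example (handler names as in the example; exact details may differ by SDK version):

    static void conn_params_init(void)
    {
        uint32_t               err_code;
        ble_conn_params_init_t cp_init;

        memset(&cp_init, 0, sizeof(cp_init));

        cp_init.p_conn_params                  = NULL;  /* NULL = use the preferred parameters set in gap_params_init() */
        cp_init.first_conn_params_update_delay = FIRST_CONN_PARAMS_UPDATE_DELAY;  /* first request 5 s after connect */
        cp_init.next_conn_params_update_delay  = NEXT_CONN_PARAMS_UPDATE_DELAY;   /* retries every 30 s */
        cp_init.max_conn_params_update_count   = MAX_CONN_PARAMS_UPDATE_COUNT;    /* give up after 3 attempts */
        cp_init.start_on_notify_cccd_handle    = BLE_GATT_HANDLE_INVALID;
        cp_init.disconnect_on_fail             = false;  /* keep the link even if the central refuses */
        cp_init.evt_handler                    = on_conn_params_evt;
        cp_init.error_handler                  = conn_params_error_handler;

        err_code = ble_conn_params_init(&cp_init);
        APP_ERROR_CHECK(err_code);
    }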

  • I am curious: are the min and max connection intervals how often the central will send a keep-alive signal?

  • There is only one interval on a BLE link, and that's the connection interval. Even if no communication happens, two "empty" PDUs are exchanged every interval (unless slave latency is allowed and the device takes the opportunity to skip certain intervals and follow up later). The main factor in connection loss (if your device's stack is implemented correctly!!!) is the supervision timeout, which must be longer than the effective connection interval including slave latency. Once either side of the link fails to receive an acknowledged PDU within that time, it must terminate the connection immediately (and because there is no peer left to signal over the radio, this is an internal process inside the stack, signalled to upper layers if applicable).

    Your strategy of lowering the connection interval makes sense, because it means more chances to exchange a valid PDU pair, which resets the supervision timer. Also, a lower interval... (1/2)
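
    To make the timeout relationship concrete (my worked sketch, units in milliseconds; the Bluetooth Core Specification requires the supervision timeout to exceed twice the effective connection interval):

    #include <stdbool.h>
    #include <stdint.h>

    /* Spec rule: conn_sup_timeout > (1 + slave_latency) * max_conn_interval * 2. */
    static bool conn_params_valid(uint32_t max_interval_ms, uint32_t latency, uint32_t timeout_ms)
    {
        return timeout_ms > (1u + latency) * max_interval_ms * 2u;
    }

    /* With the values from the question: 75 ms max interval, latency 0 ->
     * the timeout must exceed 150 ms. A 4000 ms timeout then tolerates
     * roughly 4000 / 75 = 53 consecutive missed connection events before
     * either side declares the link lost. */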
