
Slow rate from central to peripheral

Hello,

I'm working with an nRF52832 and two different Android devices: a Samsung Galaxy Note 3 Neo and an HTC One M8.

My test:

  • Central side (smartphone): every 200 ms I send 20 bytes, for 20 s -> 100 * 20 bytes
  • Peripheral side: I measure the time between the first packet and the last one.

The result :

  • HTC One M8: 20.2 s to receive the data
  • Samsung: 43 s to receive the data

We are trying to improve the data transmission because the rate is too slow on some smartphones (e.g. the Samsung Note 3 Neo).

The connection interval reported in BLE_GAP_EVT_CONNECTED is 39 for both smartphones -> 48.75 ms.

#define MIN_CONN_INTERVAL                MSEC_TO_UNITS(10, UNIT_1_25_MS)            /**< Minimum acceptable connection interval (10 ms). */
#define MAX_CONN_INTERVAL                MSEC_TO_UNITS(200, UNIT_1_25_MS)           /**< Maximum acceptable connection interval (200 ms). */
#define SLAVE_LATENCY                    0                                          /**< Slave latency. */
#define CONN_SUP_TIMEOUT                 MSEC_TO_UNITS(4000, UNIT_10_MS)            /**< Connection supervision timeout (4 s). */

How can I increase the speed ?

Thank you very much.

Nabil

  • The key is what really happens on air. Do you have a BLE sniffer to take a look? If you really transfer only 20 B every 200 ms, there is no excuse for an effective bandwidth of 20 B per 400 ms: all Android phones should use a much shorter connection interval than that, and each connection event can carry at least 20 B on top of GATT. It almost looks like an application problem (meaning you have a bug in the app, or the protocol has so much overhead that it leads to such poor performance).

  • Hello,

    Thank you for your answer.

    I just changed:

    #define MIN_CONN_INTERVAL                MSEC_TO_UNITS(10, UNIT_1_25_MS)            /**< Minimum acceptable connection interval (10 ms). */
    #define MAX_CONN_INTERVAL                MSEC_TO_UNITS(200, UNIT_1_25_MS)           /**< Maximum acceptable connection interval (200 ms). */
    #define SLAVE_LATENCY                    0                                          /**< Slave latency. */
    #define CONN_SUP_TIMEOUT                 MSEC_TO_UNITS(4000, UNIT_10_MS)            /**< Connection supervision timeout (4 s). */
    

    to :

    #define MIN_CONN_INTERVAL                MSEC_TO_UNITS(7.5, UNIT_1_25_MS)           /**< Minimum acceptable connection interval (7.5 ms). */
    #define MAX_CONN_INTERVAL                MSEC_TO_UNITS(200, UNIT_1_25_MS)           /**< Maximum acceptable connection interval (200 ms). */
    #define SLAVE_LATENCY                    0                                          /**< Slave latency. */
    #define CONN_SUP_TIMEOUT                 MSEC_TO_UNITS(4000, UNIT_10_MS)            /**< Connection supervision timeout (4 s). */
    

    and it seems to work correctly. Is that normal? The chosen connection interval still seems to be 39...

  • No, it's not usual or normal, but on the other hand you can expect anything from Android phones;) Also, what do you mean by "the connection interval seems to be 39"? Is that in native time units, i.e. 1.25 * 39 = 48.75 ms? That is actually the default connection interval used by the majority of Android devices since Android 5.0; they typically ignore the peripheral preferred connection parameters (which you are playing with). If you have a sniffer, you can simply capture a few traces: the expected behavior, the unexpected behavior, and then the expected behavior with the modified minimum connection interval. From the packet flow you should easily spot the differences and then guess what is going on in the mind of your Android phone;)

  • case BLE_GAP_EVT_CONNECTED:
        NRF_LOG_PRINTF(" central connected \r\n");

        err_code = bsp_indication_set(BSP_INDICATE_CONNECTED);
        APP_ERROR_CHECK(err_code);
        m_conn_handle = p_ble_evt->evt.gap_evt.conn_handle;

        // Both fields return 39 (units of 1.25 ms = 48.75 ms).
        NRF_LOG_PRINTF("max_conn_interval : %d \r\n", p_ble_evt->evt.gap_evt.params.connected.conn_params.max_conn_interval);
        NRF_LOG_PRINTF("min_conn_interval : %d \r\n", p_ble_evt->evt.gap_evt.params.connected.conn_params.min_conn_interval);
        break;

    // This case never occurred because I was using SD_BLE_GAP_CONN_PARAM_UPDATE,
    // which is the SVC call number, not the event ID; the event to handle is
    // BLE_GAP_EVT_CONN_PARAM_UPDATE.
    case BLE_GAP_EVT_CONN_PARAM_UPDATE:
        NRF_LOG_PRINTF("max_conn_interval : %d \r\n", p_ble_evt->evt.gap_evt.params.conn_param_update.conn_params.max_conn_interval);
        NRF_LOG_PRINTF("min_conn_interval : %d \r\n", p_ble_evt->evt.gap_evt.params.conn_param_update.conn_params.min_conn_interval);
        break;
    

    This is from that event handler. 39 means 48.75 ms, but I don't understand what the actually chosen connection interval is...

    Thank you for your help :)

  • Ah, if the event data in BLE_GAP_EVT_CONNECTED says it's 39 (= 48.75 ms), then that makes sense. Still, if this is the same for both the "OK" and "NOK" test cases, then the truth is only visible on the radio; without looking at that, it's like debugging a black box indirectly...
