
How to check when the stack is ready to send notifications?

Hi all,

I'm using two nRF51 DKs to measure an ADC input on the peripheral side and send it to the central, based on the Multilink example. One software timer is used to perform one ADC conversion every 1 sec and send a notification to the central through sd_ble_gatts_hvx(). Currently the timer is started and stopped by a button press; I need to change this so that:

  • the timer starts when a connection is established AND notifications are properly enabled;
  • the timer stops when notifications are disabled or the connection is lost.

What BLE events or functions should be used to get this done?
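The start/stop conditions above can be driven entirely from BLE events: BLE_GAP_EVT_CONNECTED / BLE_GAP_EVT_DISCONNECTED for the connection state, and BLE_GATTS_EVT_WRITE on the characteristic's CCCD for the notification state. Below is a minimal, self-contained sketch of that logic; the constants are placeholders standing in for the real SoftDevice definitions (ble.h / ble_gatts.h), and names like update_timer_state() are mine, not SDK APIs:

```c
#include <stdint.h>
#include <stdbool.h>

/* Placeholder values; the real definitions come from the nRF SDK headers. */
#define BLE_CONN_HANDLE_INVALID   0xFFFF
#define BLE_GAP_EVT_CONNECTED     0x10
#define BLE_GAP_EVT_DISCONNECTED  0x11
#define BLE_GATTS_EVT_WRITE       0x50
#define BLE_GATT_HVX_NOTIFICATION 0x01   /* CCCD bit 0 = notifications on */

static uint16_t m_conn_handle   = BLE_CONN_HANDLE_INVALID;
static bool     m_notif_enabled = false;
static bool     m_timer_running = false;

/* Stubs: in the real application these wrap app_timer_start()/stop(). */
static void timer_start(void) { m_timer_running = true;  }
static void timer_stop(void)  { m_timer_running = false; }

/* Run the ADC timer only when connected AND notifications are enabled. */
static void update_timer_state(void)
{
    if (m_conn_handle != BLE_CONN_HANDLE_INVALID && m_notif_enabled) {
        if (!m_timer_running) timer_start();
    } else {
        if (m_timer_running) timer_stop();
    }
}

/* Called from the application's BLE event dispatcher. cccd_value is the
 * 16-bit value written to the CCCD (in the real SDK you read it from
 * p_ble_evt->evt.gatts_evt.params.write). */
static void on_ble_evt(uint16_t evt_id, uint16_t conn_handle, uint16_t cccd_value)
{
    switch (evt_id) {
    case BLE_GAP_EVT_CONNECTED:
        m_conn_handle = conn_handle;
        break;
    case BLE_GAP_EVT_DISCONNECTED:
        m_conn_handle   = BLE_CONN_HANDLE_INVALID;
        m_notif_enabled = false;   /* CCCD state is gone with the link */
        break;
    case BLE_GATTS_EVT_WRITE:
        /* Real code must also verify the write hit *this*
         * characteristic's CCCD handle before trusting the value. */
        m_notif_enabled = (cccd_value & BLE_GATT_HVX_NOTIFICATION) != 0;
        break;
    }
    update_timer_state();
}
```

With this structure the timer handler itself never needs to poll the connection state; the event dispatcher keeps it started or stopped.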

Currently I check:

  1. (m_conn_handle != BLE_CONN_HANDLE_INVALID) in the timer handler - this doesn't work when the connection drops, because the handle still holds its old value until the disconnect event arrives

  2. err_code = sd_ble_gatts_hvx(m_conn_handle, &hvx_params);
     if (err_code != NRF_ERROR_INVALID_STATE && err_code != BLE_ERROR_GATTS_SYS_ATTR_MISSING)
     {
         APP_ERROR_CHECK(err_code);
     }
    
  • Since the central is off, the packets are not leaving the TX buffer. Adding more buffers will not help you. In your case, this error can be viewed as a symptom of a disconnect. The disconnect event will not happen before a certain number of connection intervals have passed (this time is often defined as CONN_SUP_TIMEOUT in our SDK). You can ignore the error if you want, or you can pause the sending of packets until you receive a BLE_EVT_TX_COMPLETE event.
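The pause-until-TX_COMPLETE pattern suggested above can be sketched as follows. This is self-contained illustration code: hvx_send() is a stub standing in for sd_ble_gatts_hvx() with a fake 6-slot buffer, the flag names are mine, and the error value is a placeholder for the real BLE_ERROR_NO_TX_BUFFERS code:

```c
#include <stdint.h>
#include <stdbool.h>

#define NRF_SUCCESS              0x0
#define BLE_ERROR_NO_TX_BUFFERS  0x3004  /* placeholder value for this sketch */

static bool m_tx_paused    = false;
static int  m_free_buffers = 6;          /* S110 queues up to 6 packets */

/* Stub for sd_ble_gatts_hvx(): fails once the fake TX buffer is full. */
static uint32_t hvx_send(void)
{
    if (m_free_buffers == 0) return BLE_ERROR_NO_TX_BUFFERS;
    m_free_buffers--;
    return NRF_SUCCESS;
}

/* Timer handler: skip sending while paused; pause when the buffer fills. */
void adc_timer_handler(void)
{
    if (m_tx_paused) return;
    if (hvx_send() == BLE_ERROR_NO_TX_BUFFERS) {
        m_tx_paused = true;              /* wait for BLE_EVT_TX_COMPLETE */
    }
}

/* Call from the BLE event dispatcher on BLE_EVT_TX_COMPLETE; count is the
 * number of packets the SoftDevice reports as transmitted. */
void on_tx_complete(uint8_t count)
{
    m_free_buffers += count;
    m_tx_paused = false;                 /* safe to queue packets again */
}
```

The point is that the application stops queuing new notifications the moment the buffer is full and only resumes once the stack confirms packets have actually left, so the error never escalates into APP_ERROR_CHECK.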

  • Andre, thanks a lot. Also, I receive the same error when trying to use a faster send period: if I try to send packets more frequently than every 100 ms, I get the same error. Why does this happen if the central's settings are:

    MIN_CONNECTION_INTERVAL          MSEC_TO_UNITS(30, UNIT_1_25_MS)                
    MAX_CONNECTION_INTERVAL          MSEC_TO_UNITS(60, UNIT_1_25_MS)                
    SLAVE_LATENCY                    0                                              
    SUPERVISION_TIMEOUT              MSEC_TO_UNITS(4000, UNIT_10_MS)
    
  • If you put more packets into the TX buffer than you send out, it will naturally fill up. The peripheral is able to send a certain number of packets each connection interval (6 for S110). With a connection interval of 60 ms, it can send (6 * (1/0.060) = 100) packets each second. Even though this is a theoretical maximum, it should be more than enough to send every 100 ms. (See this post for more info on BLE throughput.) What are the connection intervals in your peripheral application?
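The throughput ceiling quoted above is just packets-per-event divided by the interval; a tiny helper (my own illustration, not an SDK function) makes the arithmetic explicit:

```c
/* Theoretical notification throughput: the S110 SoftDevice can queue up
 * to 6 packets per connection event, so with a 60 ms connection interval
 * the ceiling is 6 / 0.060 = 100 packets per second. */
static double max_packets_per_second(int packets_per_event, double interval_s)
{
    return (double)packets_per_event / interval_s;
}
```

If the application queues packets faster than this rate, the TX buffer fills and sd_ble_gatts_hvx() starts returning the buffer-full error.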

  • Hi, Andre. Currently the slave settings are:

    #define MIN_CONN_INTERVAL                   MSEC_TO_UNITS(30, UNIT_1_25_MS)            /**< Minimum acceptable connection interval (30 ms). */
    #define MAX_CONN_INTERVAL                   MSEC_TO_UNITS(1000, UNIT_1_25_MS)          /**< Maximum acceptable connection interval (1 second). */
    #define SLAVE_LATENCY                       0                                          /**< Slave latency. */
    #define CONN_SUP_TIMEOUT                    MSEC_TO_UNITS(4000, UNIT_10_MS)            /**< Connection supervisory timeout (4 seconds). */
    

    BTW, when I tried to change SLAVE_LATENCY, I got an error from sd_ble_gap_ppcp_set() on the peripheral and from sd_ble_gap_connect() on the central. If I need to set the connection interval to 30 ms and the latency to 33 (giving a total period of about 1 sec at a 30 ms interval), how can that be done?

  • Can you set MAX_CONN_INTERVAL to 60 ms for the peripheral as well? Slave latency is used for saving power (the peripheral sleeps through x connection events); do not use it if you need throughput.
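One likely reason the latency change was rejected: the Bluetooth spec requires the supervision timeout to be strictly greater than (1 + slave latency) × max connection interval × 2, and the SoftDevice returns NRF_ERROR_INVALID_PARAM for parameter sets that violate it. A small self-contained check (my own helper, not an SDK call) shows why latency 33 fails with the settings posted above but would pass with a 30 ms maximum interval:

```c
#include <stdint.h>
#include <stdbool.h>

/* Bluetooth Core Spec constraint on connection parameters:
 *   conn_sup_timeout > (1 + slave_latency) * max_conn_interval * 2
 * (all in milliseconds here). sd_ble_gap_ppcp_set() and
 * sd_ble_gap_connect() reject parameter sets that break this rule. */
static bool conn_params_valid(uint32_t max_interval_ms,
                              uint32_t slave_latency,
                              uint32_t sup_timeout_ms)
{
    return sup_timeout_ms > (1 + slave_latency) * max_interval_ms * 2;
}
```

With MAX_CONN_INTERVAL at 1000 ms and latency 33, the timeout would need to exceed (1 + 33) × 1000 × 2 = 68000 ms, far beyond the 4000 ms configured; lowering the maximum interval to 30 ms brings the bound down to 2040 ms, which 4000 ms satisfies.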
