
BLE lost messages

Hi, as part of my project I have two Nordic nRF52 DKs, one as a peripheral (sending) and one as a client (receiving), where I am sending a buffer of 8 bytes in the following order:

buffer[0] = 10

buffer[1] = X gyro lsb

buffer[2] = X gyro msb

buffer[3] = y gyro lsb

buffer[4] = y gyro msb

buffer[5] = z gyro lsb

buffer[6] = z gyro msb

buffer[7] = 36   //terminator $

However, sometimes I seem to lose a byte over BLE, such that the next byte received is out of place and the order is corrupted, making processing unreliable and unpredictable for a large sample size like 1,000,000 samples.

How may I ensure that my BLE settings are correct so as to limit this issue?

My code was based on the examples ble_app_uart and ble_app_uart_c for the peripheral and client respectively.

Thank you in advance

Parents
  • Hello,

    However, sometimes I seem to lose a byte over BLE, such that the next byte received is out of place and the order is corrupted, making processing unreliable and unpredictable

    No byte is lost over the BLE link; it is not possible, as the protocol uses the CRC field to verify that the contents of the packet are uncorrupted and received as they were sent. If the CRC does not match, the packet is discarded and not acknowledged, leading to a re-transmission of the same packet.
    If you would like to confirm the communication happening on-air between the devices you could use the nRF Sniffer tool to trace the communication and read the individual packet contents.

    How are you handling/processing the received data? How are you alerted to the seemingly corrupted data? Have you checked whether the corrupted values could be what the peripheral intended to send?
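
    Since the BLE link guarantees packet integrity, byte-shift symptoms like this usually come from the UART path (bytes dropped between the client board and the PC) or from the receive handling itself. One common mitigation is to frame-check on the receiver: only accept an 8-byte frame whose header and terminator match, and resynchronize on the next header byte otherwise. A minimal sketch in plain C, under the assumption that your frames use header 10 and terminator 36 as described above (the parser names are illustrative):

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <string.h>

    #define FRAME_LEN   8
    #define FRAME_START 10   /* header byte from the original post  */
    #define FRAME_END   36   /* terminator '$'                      */

    typedef struct { uint8_t buf[FRAME_LEN]; int len; } frame_parser_t;

    /* Feed bytes one at a time; returns 1 when a complete, valid frame
     * is available in out[], 0 otherwise. Instead of trusting byte
     * counts, it discards input until a header byte is seen, and drops
     * frames whose terminator does not match. */
    static int frame_parser_feed(frame_parser_t *p, uint8_t byte,
                                 uint8_t out[FRAME_LEN])
    {
        if (p->len == 0 && byte != FRAME_START) {
            return 0;                  /* discard garbage until header */
        }
        p->buf[p->len++] = byte;
        if (p->len < FRAME_LEN) {
            return 0;
        }
        p->len = 0;
        if (p->buf[FRAME_LEN - 1] == FRAME_END) {
            memcpy(out, p->buf, FRAME_LEN);
            return 1;                  /* complete, valid frame        */
        }
        /* Bad terminator: a byte was probably dropped upstream. Drop
         * this frame and resync on the next header byte. (A stricter
         * parser would search the buffer for the next header.) */
        return 0;
    }

    int main(void)
    {
        frame_parser_t p = {0};
        uint8_t out[FRAME_LEN];
        /* one garbage byte, then one valid frame */
        uint8_t stream[] = {0x55, 10, 1, 2, 3, 4, 5, 6, 36};
        int got = 0;
        for (unsigned i = 0; i < sizeof stream; i++) {
            got += frame_parser_feed(&p, stream[i], out);
        }
        assert(got == 1 && out[0] == 10 && out[7] == 36);
        return 0;
    }
    ```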

    Looking forward to resolving this issue together!

    Best regards,
    Karl

  • Firstly I would like to thank you for your reply. Regarding the sniffer, is there any documentation on how to install it, please?

    The way I'm checking that data is being missed is through the UART terminal of the client (receiving) board, and I also have a MATLAB script that is reading serially as well. To check this I sent known data, meaning 8 bytes (hex) 10-16 and a terminator 36.

    As you can see, in the first column it is fine, but in the second column the 12 was not sent or received, which will therefore mess up my data.

  • NikTheNordicUser said:
    I'm not sure if what I'm saying is understandable.

    I think I get the picture, but thank you for the understanding.

    NikTheNordicUser said:
    Umm.. I'm not really sure, is there a max value?

    I don't think there is a max value other than the physical limitations of the SoC - i.e. how much RAM do you have to spare?
    However, allocating everything that you have spare is not the best approach for this. Rather, you should look at how much data will actually be transferred in each interval.
    For example, how much throughput do you need to have all the necessary sensor data sent each second, and how much latency can your system tolerate?
    If you group multiple measurements together up to the MTU size, and increase the connection event length, you will be able to send multiple larger packets per connection interval, for instance.
    But, in order to determine what configuration you should use, you need to know more exactly how many bytes you intend to transfer, and how often you intend to transfer them.
    You could take a look at the BLE throughput tables in the SoftDevice documentation to determine the best connection parameters and configuration for your application.

    It could also be helpful for you to check out the Online Power Profiler to visualize how each transfer will happen with the different connection configurations.

    Please do not hesitate to ask if any part of my answer should be unclear!

    Best regards,
    Karl

  • Ok, so let's say I want to have this configuration.

    Is it possible to be shown how or what I would need to change? Also, as previously mentioned, I would like to remove the delay so that I make it interrupt-based and use the BLE_GATTS_EVT_HVN_TX_COMPLETE event, which should ensure no resetting of the MCU.

    I would like to transfer a max of 242 bytes at once if possible, with an output data rate of max 400 (the sensor supports a data rate of 2000), but this setting should be enough for my application. Which settings do you suggest? The connection between the peripheral and client is always active, so that if there is data, it is constantly sent at the set settings.
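
    As a rough feasibility check of the numbers above (assuming one 8-byte frame per sample, as in the original post, and 242-byte notifications when batching):

    ```c
    #include <assert.h>

    int main(void)
    {
        /* 400 samples/s, 8 bytes per frame (header + 3 x int16 + terminator) */
        const int odr_hz      = 400;
        const int frame_bytes = 8;
        const int bytes_per_s = odr_hz * frame_bytes;   /* 3200 B/s          */
        const int kbps        = bytes_per_s * 8 / 1000; /* ~25 kbps          */

        assert(bytes_per_s == 3200);
        assert(kbps == 25);
        /* This is far below BLE link throughput even on the 1M PHY, so any
         * bottleneck is more likely the connection parameters (interval,
         * event length) or the UART path than the radio itself. */

        /* Batching: how many 7-byte samples (header + 6 gyro bytes, dropping
         * the per-sample terminator) fit in one 242-byte notification? */
        assert(242 / 7 == 34);
        return 0;
    }
    ```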

  • But anything I change results in the central not connecting, so I am very confused about what is going on. Ideally I would like to have the configuration with the fastest transfer.

Children
  • I would like to have a similar approach to this, such that I send a packet at a time until I receive the flag that it is finished.

  • NikTheNordicUser said:
    as previously mentioned I would like to remove the delay so that I make it interrupt-based and use the BLE_GATTS_EVT_HVN_TX_COMPLETE event, which should ensure no resetting of the MCU.

    As previously mentioned, you have to implement the handling for the BLE_GATTS_EVT_HVN_TX_COMPLETE event in the ble_evt_handler. For more detail, please see the answer by my colleague Einar in your other ticket. If anything is still unclear on how you should proceed with this, please try to be more specific in your questions.
    Please try to implement this as I have said previously - when NRF_ERROR_RESOURCES is returned from ble_nus_data_send, have the data placed in a buffer and set a flag to indicate that the buffer holds data. Then implement handling of the BLE_GATTS_EVT_HVN_TX_COMPLETE event that checks whether the buffer holds data, and if it does, queues it for transfer.
    Furthermore, the MCU does not have to reset when the NRF_ERROR_RESOURCES error code is returned from the call to ble_nus_data_send. This is the default error handling - you are free to implement specific error handling for your specific application, so that the device is able to handle the error without having to reset.
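
    The buffering pattern described above can be sketched in plain C with the SoftDevice call stubbed out. In the real firmware, try_send() would call ble_nus_data_send() and on_hvn_tx_complete() would be invoked from the BLE_GATTS_EVT_HVN_TX_COMPLETE case of the ble_evt_handler; the stub, the function names, and the single-slot buffer are illustrative assumptions:

    ```c
    #include <assert.h>
    #include <stdint.h>
    #include <string.h>

    #define NRF_SUCCESS          0
    #define NRF_ERROR_RESOURCES  19   /* the "error 19" from this thread */
    #define PACKET_LEN           8

    /* Stub standing in for ble_nus_data_send(): fails while the
     * SoftDevice notification queue is "full". */
    static int g_queue_full = 1;
    static int stub_nus_data_send(const uint8_t *data, uint16_t len)
    {
        (void)data; (void)len;
        return g_queue_full ? NRF_ERROR_RESOURCES : NRF_SUCCESS;
    }

    /* Single-slot pending buffer + flag, as suggested in the answer. */
    static uint8_t pending[PACKET_LEN];
    static int     pending_full = 0;
    static int     sent_count   = 0;

    static void try_send(const uint8_t *data)
    {
        if (stub_nus_data_send(data, PACKET_LEN) == NRF_ERROR_RESOURCES) {
            memcpy(pending, data, PACKET_LEN);  /* park the packet...     */
            pending_full = 1;                   /* ...and set the flag    */
        } else {
            sent_count++;
        }
    }

    /* In real firmware: called from the BLE_GATTS_EVT_HVN_TX_COMPLETE
     * case, i.e. when the SoftDevice has freed a notification buffer. */
    static void on_hvn_tx_complete(void)
    {
        if (pending_full) {
            pending_full = 0;
            try_send(pending);                  /* queue parked packet    */
        }
    }

    int main(void)
    {
        uint8_t pkt[PACKET_LEN] = {10, 1, 2, 3, 4, 5, 6, 36};
        try_send(pkt);            /* queue full -> parked, no reset needed */
        assert(pending_full == 1 && sent_count == 0);
        g_queue_full = 0;         /* SoftDevice frees a buffer...          */
        on_hvn_tx_complete();     /* ...and the parked packet goes out     */
        assert(pending_full == 0 && sent_count == 1);
        return 0;
    }
    ```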

    NikTheNordicUser said:
    but anything I change results in the central not connecting, so I am very confused about what is going on, and ideally I would like to have the configuration with the fastest transfer
    NikTheNordicUser said:
    ideally I would use the 2M PHY for throughput
    NikTheNordicUser said:
    I currently have this configuration

    The central determines the connection parameters. The peripheral may only either accept the parameters that the central sends, or disconnect. If there is a mismatch between the preference of the central and the peripheral, the peripheral will send a parameter update request. This request can be declined by the central, and the peripheral may then either accept this or disconnect.
    If you wish to change your connection parameters, you will therefore have to change them on the central side.
    Then, you should also update the preference of the peripheral to match the changes you've made on the central side, or make sure that the peripheral does not disconnect if its preferences are not met.
    If you wish to set the connection interval to a specific number, like 50 ms, then you have to set both _MIN and _MAX to 50 ms, forcing this particular interval.
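
    As an illustration of forcing one specific interval: the SDK examples express the preferred parameters with the MSEC_TO_UNITS macro (in 1.25 ms units for the connection interval). The macro definitions below are reproduced so the arithmetic can be checked standalone; in a real project you would simply include app_util.h and put these defines in main.c:

    ```c
    #include <assert.h>

    /* Reproduced from the nRF5 SDK's app_util.h for a standalone check. */
    #define UNIT_1_25_MS  1250
    #define MSEC_TO_UNITS(TIME, RESOLUTION) (((TIME) * 1000) / (RESOLUTION))

    /* Forcing a specific 50 ms interval: _MIN and _MAX set to the same
     * value, so the central cannot pick anything else the peripheral
     * would accept. */
    #define MIN_CONN_INTERVAL  MSEC_TO_UNITS(50, UNIT_1_25_MS)  /* 40 units */
    #define MAX_CONN_INTERVAL  MSEC_TO_UNITS(50, UNIT_1_25_MS)  /* 40 units */

    int main(void)
    {
        assert(MIN_CONN_INTERVAL == 40);
        assert(MAX_CONN_INTERVAL == MIN_CONN_INTERVAL);
        /* For reference: a 7.5 ms interval corresponds to 6 units. */
        assert((int)MSEC_TO_UNITS(7.5, UNIT_1_25_MS) == 6);
        return 0;
    }
    ```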

    NikTheNordicUser said:
    I would like to have a similar approach to this, such that I send a packet at a time until I receive the flag that it is finished.

    To send multiple notification packets back-to-back in the connection event, the event length needs to be configured to be the same as the connection interval, as mentioned directly before the throughput table you've shared a screenshot of. What is your event length?
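
    For reference, in the nRF5 SDK the event length is typically set via the NRF_SDH_BLE_GAP_EVENT_LENGTH define in sdk_config.h, in units of 1.25 ms. A sketch of making it span a whole connection interval (the value 40 here is an assumption matching a 50 ms interval; use whatever your actual interval is):

    ```c
    /* sdk_config.h (fragment) - units of 1.25 ms.
     * Many SDK examples default this to 6 (7.5 ms); for maximum
     * throughput, make the event length cover the whole connection
     * interval, e.g. 40 units = 50 ms for a 50 ms interval. */
    #define NRF_SDH_BLE_GAP_EVENT_LENGTH 40
    ```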

    Try these two things, and let me know if anything is unclear, or if you encounter any issues.

    Best regards,
    Karl

  • Hi Karl, firstly I would like to thank you - I finally solved the error 19 issue. Apologies for being unclear at times; I am still a student, and this project is my final year project which is due very soon, so I may come across as unclear. Now my only 'issue' is speed, as I would like to make it a bit faster.

    My current peripheral settings are as follows

    Are there any other settings that I may be missing? Also, just to be sure I'm understanding: the connection interval is the connection between central and peripheral, right? And the app_adv_interval is how long the data packet takes? Or is that the duration?

    What is your event length?

    In the peripheral it is set to 6.

    The central determines the connection parameters. The peripheral may only either accept the parameters that the central sends, or disconnect. If there is a mismatch between the preference of the central and the peripheral, the peripheral will send a parameter update request. This request can be declined by the central, and the peripheral may then either accept this or disconnect.

    To ensure I understood: so I need to make the peripheral and client parameters equal, and the client parameters are changed in the sdk_config header, right?

    Also, is there a way to measure the speed of the Bluetooth using my oscilloscope? You have previously suggested downloading nrf_sniffer, and I attempted to, but I honestly couldn't manage to download it, as I have never used Python and got stuck at the very beginning when it said to open the command prompt in the folder.

    Once again, I apologize for the constant messages and larger questions.

    Thank you in advance

  • Hello,

    NikTheNordicUser said:
    firstly I would like to thank you - I finally solved the error 19 issue

    No problem at all, I am happy to help! I am also glad to hear that you have resolved the NRF_ERROR_RESOURCES error code being returned from ble_nus_data_send!
    Did you manage to implement the _HVN handling to queue the failed notification as I suggested?

    NikTheNordicUser said:
    apologies for being unclear at times; I am still a student, and this project is my final year project which is due very soon, so I may come across as unclear.

    No need to apologize, I try to understand as best I can. Thank you for the understanding though! :) 

    NikTheNordicUser said:
    just to be sure I'm understanding: the connection interval is the connection between central and peripheral, right?

    The connection interval is the time between each connection event. A connection event is the event in which the devices actually communicate.
    The BLE protocol is low power because the radio is active as little as possible (among other things). The radio consumes a lot of power, so minimizing the time it is on reduces overall power consumption. So, in BLE the devices agree on specific intervals in which they will both turn on their radios to communicate - so that it does not have to be on all the time, and waste power. By lowering the connection interval you are increasing power consumption, but gain reduced latency and increased throughput.
    Please let me know if any part of this should be unclear.

    NikTheNordicUser said:
    In the peripheral it is set to 6.

    With units of 1.25 ms, this translates to a 7.5 ms connection event length.
    In the screenshot you have shared (which is the peripheral side, so it is still just the peripheral's preference) you have configured the minimum connection interval to 20 ms, and the max to 75 ms - so you are not doing what the throughput documentation has told you to do, which is to have the event length be equal to the connection interval length.
    It is important that you read the documentation properly if you hope to achieve the results it details.

    The connection event length is the time the devices are allowed to keep talking in each connection event. So, in the case that the devices have 'infinite' (for all practical purposes) information to communicate, they will still only communicate back and forth for 7.5 ms every 20 ms (if they use the lowest connection interval you have configured).
    If you set the event length to be equal to the connection interval length, they may communicate all the way up until the next connection event. This is essential for throughput (but consumes more power).

    NikTheNordicUser said:
    the app_adv_interval is how long the data packet takes? or is that the duration.

    The APP_ADV_INTERVAL is the advertising interval, which specifies how frequently the device will send out advertising packets. Once a connection is established, the advertising stops (if the peripheral device only wants to hold one concurrent connection). The advertising interval does not matter for the throughput of your connection.

    NikTheNordicUser said:
    so i need to make the pheripheral and client equal parameters and the client paramters are changes in the sdk config header right?

    The corresponding terms are peripheral and central, and client and server, but yes. You need to make sure that the connection parameters are the same on both sides to ensure that the connection will use those specific parameters, because the central has to determine them (tell the peripheral what to use), and the peripheral has to accept them (not disconnect).

    NikTheNordicUser said:
    is there a way to measure the speed of the Bluetooth using my oscilloscope.

    This depends. Without complex monitoring you can use the oscilloscope to measure the activity on the antenna (following all the procedures and using the required equipment to measure on the antenna without distorting the signals). This will at least tell you that the devices are filling up their connection event with packets back and forth. It is however not an accurate way (by any means) to measure throughput, since you will not know if a packet is being retransmitted, for example.
    To measure throughput you could instead create a setup that always has more data to send, and then see how much data the other device receives in a given timespan. For example, set up a timer that triggers a disconnect (or just logs how much is received) after a given time, starting when the first packet is received. This way, you can much more accurately estimate your throughput.
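
    The logging approach can be sketched in plain C. In the real client you would feed add_rx() from the NUS client's data event and compute the result from an app_timer tick; those hook points, the function names, and the simulated numbers below are assumptions:

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Accumulate received bytes, then compute the achieved throughput
     * over a measurement window. In firmware, add_rx() would be called
     * from the NUS client's data handler, and the window would be
     * measured with an app_timer. */
    static uint32_t rx_bytes = 0;

    static void add_rx(uint32_t len) { rx_bytes += len; }

    static uint32_t throughput_bps(uint32_t elapsed_ms)
    {
        return (uint32_t)(((uint64_t)rx_bytes * 8u * 1000u) / elapsed_ms);
    }

    int main(void)
    {
        /* Simulate: 1000 notifications of 244 bytes over a 2 s window. */
        for (int i = 0; i < 1000; i++) add_rx(244);
        assert(rx_bytes == 244000);
        assert(throughput_bps(2000) == 976000);   /* 976 kbps */
        return 0;
    }
    ```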

    However, the method I would recommend is to set up the sniffer tool that I mentioned earlier, and see the on-air BLE traffic for yourself. This way, you can see exactly what is being sent between the two devices, what connection parameters are being used, how big each packet is, how much of the connection event is spent communicating, etc. You would then have the answer to all of this immediately.
    If you still don't feel confident that the sniffer is worthwhile to install, you should instead do as I detail above and create a setup to measure the achieved throughput - it is not a lot of work (neither is setting up the sniffer).

    NikTheNordicUser said:
    got stuck at the very beginning when it said to open the command prompt in the folder.

    You can open a command prompt in a given folder by typing 'cmd' into the file explorer address bar. There is also not a lot that needs doing in Python; please see the steps detailed in the nRF Sniffer documentation for details.

    Best regards,
    Karl
