My nRF52832 firmware application transmits data to an iPad, which renders it as a flowing waveform. The waveform flowed smoothly with SDK13.1. After migrating the firmware to SDK15.2, the waveform steps across the screen rather than flowing smoothly. The min and max connection intervals have not changed. I noticed that SDK13.1 had no entry for NRF_SDH_BLE_GAP_EVENT_LENGTH in sdk_config.h, so I assumed it used the default of 3 from BLE_GAP_EVENT_LENGTH_DEFAULT. In SDK15.2 there is an entry for NRF_SDH_BLE_GAP_EVENT_LENGTH, and it was set to 6, so I changed it to 3 to match SDK13.1. This did not help much.
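For reference, this is the sdk_config.h entry I changed in the SDK15.2 project (3 matches the SDK13.1 default BLE_GAP_EVENT_LENGTH_DEFAULT):

```c
// sdk_config.h (SDK15.2) -- was 6; set to 3 to match the SDK13.1 default
#define NRF_SDH_BLE_GAP_EVENT_LENGTH 3
```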
In SDK15.2 the nrf_ble_gatt.c logs are:
Peer on connection 0x0 requested a data length of 251 bytes.
Updating data length to 27 on connection 0x0.
Data length updated to 27 on connection 0x0.
max_rx_octets: 27
max_tx_octets: 27
max_rx_time: 1364
max_tx_time: 1364
on_ble_event: got BLE_GATTS_EVT_EXCHANGE_MTU_REQUEST
Peer on connection 0x0 requested an ATT MTU of 293 bytes.
Updating ATT MTU to 23 bytes (desired: 23) on connection 0x0.
In SDK13.1 the nrf_ble_gatt.c logs are:
Peer on connection 0x0 requested a data length of 251 bytes.
Updating data length to 27 bytes on connection 0x0.
Data length updated to 27 on connection 0x0.
max_rx_octets: 27
max_tx_octets: 27
max_rx_time: 328
max_tx_time: 328
on_ble_event: got BLE_GATTS_EVT_EXCHANGE_MTU_REQUEST
Peer on connection 0x0 requested an ATT MTU of 185 bytes.
Updating ATT MTU to 23 bytes (desired: 23) on connection 0x0.
The glaring differences are max_rx_time and max_tx_time. In SDK15.2 I can reduce them by lowering NRF_SDH_BLE_GAP_EVENT_LENGTH to its minimum allowable value of 2, but they are still more than twice the SDK13.1 values. That did help the waveform slightly, though.
What other parameters should I look at to get my waveform flowing smoothly again? The max rx and tx times are set to BLE_GAP_DATA_LENGTH_AUTO in both SDKs. Are there other parameters that may differ between the two SDKs?
If I interpret this correctly, the time domain of your waveform is affected by the rate at which you receive packets? I.e., the waveform is not drawn smoothly when the client receives multiple samples per connection event, so you want to limit the transfer to one sample per connection event. In that case, I think you can specify the max tx time through the nrf_ble_gatt module instead of using the BLE_GAP_DATA_LENGTH_AUTO feature.
That said, maybe it would be better to control the packet transfer rate on the peripheral side, e.g. with an app timer instance? Or use a timer to create timestamps that you append to the packets (provided you can fit them into your 20-byte payload). That should result in a more accurate waveform.
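To sketch the app timer idea (a rough fragment, not a complete program: APP_TIMER_DEF, app_timer_create, and app_timer_start are the SDK's app_timer APIs, but the handler body, names, and 15 ms period here are illustrative):

```
#include "app_timer.h"

APP_TIMER_DEF(m_tx_pacing_timer);   /* illustrative name */

static void tx_pacing_handler(void * p_context)
{
    /* Queue one notification's worth of samples here so packets go
     * out at a fixed rate instead of in bursts per connection event. */
}

/* During init: */
app_timer_create(&m_tx_pacing_timer, APP_TIMER_MODE_REPEATED, tx_pacing_handler);
app_timer_start(m_tx_pacing_timer, APP_TIMER_TICKS(15), NULL);  /* every 15 ms */
```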
The nRF52 receives data over SPI at 1.5K samples per second, which must then be transmitted to the central via BLE notifications. That doesn't seem achievable when transmitting only one sample per connection event with 15 ms connection intervals. I'm not looking for perfectly smooth; I just don't want it to look like it's stepping rather than flowing. The waveform at the central flowed fairly smoothly with SDK13.1, but not smoothly at all with SDK15.2. I don't understand why max_rx_time and max_tx_time don't match between the two SDKs when I set NRF_SDH_BLE_GAP_EVENT_LENGTH to the same value.
I checked with the team: it's OK to manually set the rx/tx time to 328 us when you have 27-byte LL packets, as long as you don't use Coded PHY. The new SoftDevice sets aside more time for rx/tx packets, as you observed.
Thanks. I'll try the change with SDK15.2 and see if it results in similar waveforms as using SDK13.1.
So now with SDK15.2 I have a 328 us rx/tx time, the same as the auto-selected value in SDK13.1. My max rx/tx octets is 27 for both SDKs. My min/max connection intervals are the same for both SDKs at 30 ms. My NRF_SDH_BLE_GAP_EVENT_LENGTH is 3 for both SDKs.
With the SDK13.1 version, packets are consistently sent every 30 ms, with 3 packets per connection event and only occasionally 1 or 2. This results in a fairly smooth waveform in the iPad app.
With the SDK15.2 version, the packet pattern alternates between 30 ms and 90 ms intervals, with 5 packets sent per connection event. This makes the waveform step across the iPad screen rather than flow smoothly.
Did the number of packets per connection event change with the SoftDevice change?
Do you have any other suggestions? Thank you for all of your help.