
How to send large data through BLE notification using nrf52832

Hi everyone,

I am writing a program in Keil in which I have to send a large amount of data through BLE notifications in real time. The program uses two timers: one timer triggers the ADC to sample at 400 Hz, i.e. every 2.5 ms, and the other timer checks every 1 ms whether the data is ready to be sent to the phone and sends it once 100 samples have accumulated, i.e. every 250 ms. There are 6 ADC channels sampled each time and each channel requires 2 bytes of storage, so 12 bytes per sample in total. Therefore 1200 bytes are sent every 250 ms. However, this notification is only ever switched on for a couple of seconds for data verification purposes.
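
To make the numbers above concrete, the buffering works out to something like this (a simplified sketch, not my exact declarations; BYTES_PER_CHANNEL is just an illustrative name):

#include <stdint.h>

#define NUMBER_OF_ADC_CHANNELS   6     // channels sampled on every ADC trigger
#define BYTES_PER_CHANNEL        2     // each channel result is 16 bits
#define MAX_NUMBER_OF_SAMPLES    100   // samples buffered before sending: 250 ms at 400 Hz

// 6 channels * 2 bytes * 100 samples = 1200 bytes buffered per 250 ms burst
static uint8_t data[NUMBER_OF_ADC_CHANNELS * BYTES_PER_CHANNEL * MAX_NUMBER_OF_SAMPLES];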

This means a throughput of at least 4800 bytes per second is required; however, the current throughput is below this as I am losing packets. Is there any way of sending more packets, or is it better for me to use a large array to hold the data for a couple of seconds and then send it all at once instead of sending data every 250 ms?

Below is my function for sending the notification

void timer_timeout_handler2()
{
    if ((new_data == true) && (notification_enabled == true))
    {
        // Queue one notification per sample; each sample occupies
        // 2 * NUMBER_OF_ADC_CHANNELS = 12 bytes of the data buffer.
        for (int i = 0; i < MAX_NUMBER_OF_SAMPLES; i++)
        {
            ble_cus_custom_value_update(&m_cus, data + 2 * NUMBER_OF_ADC_CHANNELS * i);
        }
        new_data = false;
    }
}

Right now it calls the notification function 100 times to send 100 samples' worth of data every time the ADC has finished sampling 100 times (MAX_NUMBER_OF_SAMPLES = 100, NUMBER_OF_ADC_CHANNELS = 6). I do not know how the packets are packaged or whether this is the correct way of sending the data, as each notification payload is only 12 bytes instead of 20 for maximum throughput, due to how the data is processed on the client side. But maybe the SoftDevice is splitting the data into 20-byte chunks? Also, from the datasheet it seems like only 6 packets can be sent per connection interval, so I do not know how it handles sending the 100 samples; is each 12-byte sample treated as a packet? Is there any way of spacing the data evenly? Right now on the client side I get the data in chunks, which I presume is due to the data being split across multiple connection intervals; however, not all of the data can be sent before the ADC has finished sampling another 100 samples, hence the lost packets. What would be a better way of implementing this?
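
For reference, one thing I was considering is to check the return value instead of blindly queuing all 100 notifications, roughly like the sketch below. This assumes ble_cus_custom_value_update returns the error code from sd_ble_gatts_hvx (as in the Nordic custom service tutorial) and that a full TX queue is reported as NRF_ERROR_RESOURCES on my SoftDevice version; older SoftDevices report BLE_ERROR_NO_TX_PACKETS instead. It reuses the same globals as my handler above.

#include "app_error.h"
#include "nrf_error.h"

static uint32_t sample_index = 0;   // next sample to notify, kept across timer ticks

void timer_timeout_handler2()
{
    if ((new_data == true) && (notification_enabled == true))
    {
        while (sample_index < MAX_NUMBER_OF_SAMPLES)
        {
            uint32_t err_code = ble_cus_custom_value_update(
                &m_cus, data + 2 * NUMBER_OF_ADC_CHANNELS * sample_index);

            if (err_code == NRF_ERROR_RESOURCES)
            {
                // TX queue full: keep sample_index and retry on the next timer tick
                return;
            }
            APP_ERROR_CHECK(err_code);
            sample_index++;
        }
        sample_index = 0;
        new_data     = false;
    }
}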

Thanks for the help

Rgs, Bryan

  • Hi Bryan,

    Throughput of 4800 bytes per second can be challenging when you do not control one side of the link (i.e. when the peer is a phone or a similar "black box" with incomplete API access to the lower BLE layers). Most Android devices use a ~50 ms connection interval when in the GAP Central (master) role, and even though they usually support multiple packets per interval (let's say 6), not all of them accept an extension of the default ATT_MTU size (which gives you an effective payload of 20 B per packet) or lowering the interval to the 7.5-20 ms range where you want it to be. So a "dumb" cheap Android phone will give you roughly 1000/50 * 6 * 20 B = 2400 B/s in the ideal case of no packet loss and retransmissions. As you can see, only with good phones where you can lower the interval or extend the ATT_MTU size to send more than 120 B per 50 ms can you get more, but at the ~5000 B/s range you will definitely have a problem making it work outside a pre-selected set of "flagship" devices (e.g. iPhones today support an ATT_MTU of at least ~150 B and use a connection interval of 30 ms, but it is a question how many of such long MTUs will squeeze into one interval).
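
    To make that arithmetic explicit, here is a back-of-the-envelope helper (the numbers are the same assumptions as above, not values guaranteed by any particular phone):

    #include <stdio.h>

    // Rough ideal-case BLE throughput: packets per interval * payload bytes,
    // scaled to one second, ignoring packet loss and retransmissions.
    static unsigned throughput_bps(unsigned conn_interval_ms,
                                   unsigned packets_per_interval,
                                   unsigned payload_bytes)
    {
        return (1000u / conn_interval_ms) * packets_per_interval * payload_bytes;
    }

    int main(void)
    {
        // Typical "dumb" Android: 50 ms interval, 6 packets, 20 B payload -> 2400 B/s
        printf("%u B/s\n", throughput_bps(50, 6, 20));
        // Lowered interval of 15 ms with the same packets and payload -> 7920 B/s
        printf("%u B/s\n", throughput_bps(15, 6, 20));
        return 0;
    }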

    I would definitely start with a pure feasibility test of your BLE connection throughput where you "stream" just dummy strings, to see what the ideal ATT_MTU and LL PDU sizes are (these are properties of the nRF52 stack that you basically need to provision at compile time because of RAM allocation) and how low you can go with the connection interval through the Peripheral Connection Parameters Update Request procedure (this is more of a "per connection" thing). Finally you will need to evaluate it on as many "reference" phones as possible, especially if you want to market it later to general consumers. Once you find the optimal values you will see how you need to fragment your stream of ADC samples (or other data). As these are basically continuous in your case, you should be free to go from 20-byte up to thousand-byte packets as needed.
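
    For the connection interval part, the peripheral-side request looks roughly like this in the nRF5 SDK (just a sketch; request_faster_connection is an illustrative name, the values are examples the central is free to reject or adjust, and conn_handle comes from your BLE_GAP_EVT_CONNECTED event):

    #include "app_error.h"
    #include "app_util.h"
    #include "ble_gap.h"

    // Ask the central for a shorter connection interval (sketch; the phone
    // decides whether to accept the requested range).
    static void request_faster_connection(uint16_t conn_handle)
    {
        ble_gap_conn_params_t conn_params;
        conn_params.min_conn_interval = (uint16_t) MSEC_TO_UNITS(7.5, UNIT_1_25_MS); // 7.5 ms
        conn_params.max_conn_interval = (uint16_t) MSEC_TO_UNITS(20,  UNIT_1_25_MS); // 20 ms
        conn_params.slave_latency     = 0;
        conn_params.conn_sup_timeout  = (uint16_t) MSEC_TO_UNITS(4000, UNIT_10_MS);  // 4 s

        uint32_t err_code = sd_ble_gap_conn_param_update(conn_handle, &conn_params);
        APP_ERROR_CHECK(err_code);
    }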

    Cheers Jan
