
nrf51822 throughput iPhone6 BLE 4.0 iOS 10.3

Hello,

I'm using an nRF51822 to send data to an iPhone over the Nordic UART Service (NUS). I'm on the old SDK v4.x with the S110 v5.x SoftDevice. I know that the minimum connection interval on iOS 10 is 30 ms and that iOS accepts 3 packets per connection interval.

With the maximum payload of 20 bytes per packet, the throughput works out to:

Throughput = (1/30 ms) * 3 packets * 20 bytes = 2000 B/s

If I instead send only 1 byte per packet rather than the full 20-byte payload:

Throughput = (1/30 ms) * 3 packets * 1 byte = 100 B/s

100 B/s means 1 byte of data being sent every 1/100 of a second.

Currently I have a timer that fires every 0.01 s, i.e. every 1/100 of a second. When the timer fires, the sensor values are sent to the iPhone. When I log the data on the iPhone over a period of 5 minutes, I receive only 66.667 B/s, not 100 B/s.

I have been trying to track down this issue, so far without success. Can anyone help me out?

Parents
  • Footnote: 100 B/s doesn't mean that 1 byte is sent every 1/100 of a second. It means the average throughput is 100 bytes per second: if you measure the amount of data transferred over a longer interval, say 10 or better 100 seconds, you will see a value very close to 100 B/s (there may be some packet loss/retransmission events). I actually believe the problem is in your app: you simply don't utilize the available throughput correctly with your 0.01 s timer. Do you have a proper queue on the timer/sensor side that stores data for later use by the UART service, and, on the NUS side, an independent state machine that takes the queued data and prepares the next TX packet (which will go out within 30 ms)? The connection interval is the main "timer" in your system, not some artificial 0.01 s tick.

