Bluetooth Low Energy Direct Test Mode Packet Error Rate Measurement

I have been trying to measure the PER (packet error rate) between two nRF52840 DKs using BLE. To do so, I am using BLE Direct Test Mode (DTM) to start a transmission on one nRF52840 DK and another development kit to listen for the broadcast and count the packets. The BLE DTM standard does not provide a way to count the number of sent packets on the transmitter side, so I am trying to calculate a theoretical number of sent packets using the formula supplied by "BLUETOOTH CORE SPECIFICATION Version 5.4 | Vol 6, Part F page 3063".

To get the packet length in bits I use the formula:

packet_len = pdu_payload_len + 8 + 8 + 32 + 24 + 8 * phy_bandwidth

with phy_bandwidth being the PHY mode (1 for 1 Mbit/s, 2 for 2 Mbit/s). The constant terms are the 16-bit PDU header (8 + 8), the 32-bit access address, and the 24-bit CRC; 8 * phy_bandwidth is the preamble, which is 8 bits on the 1M PHY and 16 bits on the 2M PHY.
To get the LE Test packet length L in µs, I divide this bit count by phy_bandwidth (in Mbit/s):

L = packet_len / phy_bandwidth

I use L to calculate the transmit interval I(L) according to the formula specified in the core specification.
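For reference, that formula rounds L + 249 µs up to the next multiple of 625 µs:

I(L) = ceil((L + 249) / 625) * 625 µs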
I then multiply I(L) by the number of packets the receiver DK reports as received and divide by the time period the receiver was active:

packet_error_rate = (1 - I(L) * packet_count / time_delta) * 100%
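As a quick worked example (taking a 37-byte payload on the 1M PHY, values chosen purely for illustration): packet_len = 37 * 8 + 8 + 8 + 32 + 24 + 8 = 376 bits, so L = 376 µs and I(L) = ceil((376 + 249) / 625) * 625 = 625 µs, i.e. one packet should be on air every 625 µs.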
Unfortunately, using this method I calculate a PER of around 75%, even though the receiver and transmitter DKs are right next to each other and the transmitter is set to the highest TX power.
What could be the source of this problem? Is my approach of calculating the PER based on bandwidth and packet length incorrect?
Any help is much appreciated.
BASH:
pdu_payload_len=$(( packet_len_l6 + packet_len_higher * 16 )) # payload length in bits
phy_packet_len=$(( pdu_payload_len + 8 + 8 + 32 + 24 + 8 * phy_bandwidth )) # on-air packet length in bits
# packet duration in µs: bit count divided by the PHY rate in Mbit/s
phy_packet_dur=$(echo "scale=0; $phy_packet_len / $phy_bandwidth" | bc)
# I(L): round up to the next multiple of 625 µs (the +624 implements the ceiling)
transmit_interval_dur=$(( ((phy_packet_dur + 249 + 624) / 625) * 625 ))
# PER in %, with delta in seconds: 100 * (1 - packet_count * interval_us / (delta * 1e6))
PER=$(echo "scale=2; 100 - ($packet_count * $transmit_interval_dur) / ($delta * 10000)" | bc)
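As a sanity check of that last line with made-up numbers (a 10 s window with perfect reception at one packet per 625 µs, i.e. 16000 packets):

packet_count=16000 # hypothetical: 10 s / 625 µs, every expected packet received
transmit_interval_dur=625
delta=10 # seconds
echo "scale=2; 100 - ($packet_count * $transmit_interval_dur) / ($delta * 10000)" | bc # prints 0, i.e. 0% PER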
  • Unfortunately, the problem is not resolved by this.

    Under ideal conditions, packets are received every 2.5 ms, which is a much lower rate than the Bluetooth SIG specifies: the theoretical packet rate should be one packet every 625 µs. Could it be that there is some limit on how fast a Direct Test Mode device can receive packets?
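    (For what it's worth, (1 - 625/2500) * 100% = 75%, which is exactly the PER I calculated above, so a 2.5 ms reception interval alone would explain that number.)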

  • Hello Leon,

    Leon Weigel said:
    The BLE DTM standard does not incorporate the possibility to count the number of sent packets on the transmitter side

    Leon Weigel said:
    Could it be that there are some limits as to how fast a direct test mode device can receive packets?

    Yeah, that might be. The DTM sample is more focused on the hardware side of things: it is designed to test the operation of the radio at the physical level, such as transmission power, receiver sensitivity, frequency offset and drift, modulation characteristics, etc.

    If this is a standardised test I haven't heard of, there might be a go-to way to set it up. I can look into that, though do you know why you don't, for instance, simply use the BLE throughput sample?

    Regards,

    Elfving


  • I will look into the throughput sample. The documentation mentions that DTM can be used for PER measurements, which is why I decided to use it.

    For now, I believe I have finally fixed the problem by increasing the packet transmission interval from 625 µs in the direct_test_mode sample application.
    I also found a potential bug in the direct test mode implementation:

    dtm.c line 85:

    /* Time between start of TX packets (in us). */
    #define TX_INTERVAL 625

    This define does not seem to do anything: TX_INTERVAL is not referenced anywhere in the application. The interval instead appears to be hardcoded as the numeric literal 625 at lines 1617 and 1620 of dtm.c.
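    For anyone reproducing this: after changing the interval in the sample, the PER script above has to use the same value in place of the computed I(L). A minimal sketch, with 2500 µs as a placeholder (use whatever interval you actually patch into dtm.c):

    tx_interval_us=2500 # must match the interval hardcoded in dtm.c (placeholder value)
    PER=$(echo "scale=2; 100 - ($packet_count * $tx_interval_us) / ($delta * 10000)" | bc) # delta in seconds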