
Expected PER while using DTM and 2 nRF-dev boards

Hi

I'm preparing for some tests and I'm wondering where I can find information about what the expected packet error rate (PER) should be.

Right now, with 2 nRF boards connected, I get around 0-1% PER while testing every channel.

  • And this is correct until you go down to -30/-40 dBm Tx power, start to shield/obstruct one of the units, or move them dozens of meters apart. Note that if you are using the Nordic examples (Python script), you already have several packets of systematic error, so depending on your statistics you can have a 100% reliable link but still report 1-5% PER. DTM only reports the number of received packets on the Rx side, not the number of issued packets on the Tx side; that count is derived from timing, and if you are using a slow UART plus some other middleware (such as any PC with a serial driver and an OS running) then you are well into the millisecond range of delay/inaccuracy, while DTM packets use 0.625 ms timing.
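To make the systematic error concrete, here is a minimal sketch of the timing-based Tx estimate described above. The function name and the numbers are illustrative, not taken from the Nordic script; only the 0.625 ms packet interval comes from the DTM spec.

```python
# Sketch: estimating PER when the Tx packet count is derived from elapsed
# time (as a host-side DTM script must do) rather than reported by the
# transmitter. Each millisecond of host-side UART/OS delay mis-counts
# roughly 1.6 packets.

DTM_PACKET_INTERVAL_S = 0.000625  # one DTM packet every 0.625 ms

def estimate_per(rx_packet_count, elapsed_s, host_delay_s=0.0):
    """Estimate PER; host_delay_s models UART/OS timing error on the host."""
    tx_estimate = round(elapsed_s / DTM_PACKET_INTERVAL_S)
    per = 100.0 * (1.0 - rx_packet_count / tx_estimate)
    # How many packets the host-side delay alone is worth:
    uncertainty_packets = host_delay_s / DTM_PACKET_INTERVAL_S
    return per, uncertainty_packets

# A 1 s test where every packet was actually received, but 3 ms of
# host delay inflates the measured elapsed time -- nonzero PER reported
# over a perfectly reliable link:
per, err = estimate_per(rx_packet_count=1600, elapsed_s=1.003,
                        host_delay_s=0.003)
```

This is why a handful of "lost" packets in short test runs can be pure measurement artifact: the shorter the run, the larger the share of the total that the fixed timing error represents.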

  • So it's not possible to see the number of Tx messages sent in any way? Also, if I increase the length from 35 to 40 I suddenly get a 100% error rate.

  • The number of Tx packets is derived from timing: the elapsed time is divided by 0.625 ms, which can be precise if you have very good timing signalling. Note that DTM Tx commands are kind of a "low cost" option; normally you would purchase one of those 10-100k USD radio synthesizers/analyzers, which will tell you precisely how many packets were sent/received ;) As for your problem with a 40 B PDU length (is this what you are trying to do?): if you run DTM according to the BT SIG spec v4.0/4.1, it supports only PDU lengths of 0-37 B, so you are out of range (the DTM command is most probably ignored by the FW, so you get 100% packet loss because nothing is transmitted/received).
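For reference, here is a sketch of how the 37-byte limit shows up in the 2-wire UART DTM command word (Bluetooth Core spec, Vol 6 Part F: 2-bit command, 6-bit frequency index, 6-bit length, 2-bit packet type). The function is an illustration written for this post, not Nordic API code; field widths are per the 4.0/4.1 spec from memory, so verify against the spec before relying on it.

```python
# The 6-bit length field can physically encode 0-63, but BT 4.0/4.1
# only allows 0-37 -- a length of 40 fits in the field yet is out of
# spec, which is consistent with the DUT firmware ignoring the command.

LE_TRANSMITTER_TEST = 0b10  # 2-bit command code for "start Tx test"

def dtm_tx_command(freq_mhz, length, pkt_type):
    """Build the 16-bit LE_TRANSMITTER_TEST command word, MSB first."""
    if not 2402 <= freq_mhz <= 2480 or freq_mhz % 2:
        raise ValueError("frequency must be 2402..2480 MHz, even")
    if not 0 <= length <= 37:
        raise ValueError("PDU length must be 0..37 bytes for BT 4.0/4.1")
    freq_index = (freq_mhz - 2402) // 2          # 6-bit channel index
    word = (LE_TRANSMITTER_TEST << 14) | (freq_index << 8) \
           | (length << 2) | (pkt_type & 0b11)
    return word.to_bytes(2, "big")

dtm_tx_command(2402, 37, 0b00)    # accepted: maximum in-spec length
# dtm_tx_command(2402, 40, 0b00)  # raises ValueError: out of spec
```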

  • I see! Well, what I actually want to do is a blocking test: I want to send x amount of bytes and see how many bytes are lost.

  • Well, that's not how DTM works. The receiver in DTM always checks the integrity of the PDU (there is a 24-bit CRC as per the BT LE packet definition) and then the data length + content (there are 3 payload patterns available in DTM by specification). Any error means the packet is logged as lost, but you have no idea what went wrong. To be honest, that's how BLE and pretty much any radio works: once you receive something which doesn't match the basic expectations for structure (such as preamble and integrity checksum), you can hardly say whether it was complete noise or just some part of the message got corrupted. From this perspective I don't understand what you want to test...
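The all-or-nothing nature of that receive path can be sketched like this. The `crc24` function below is a deliberate stand-in (the real BLE CRC uses polynomial 0x100065B, Core spec Vol 6 Part B); the point is only the decision structure: a packet either passes every check and counts as received, or fails any one check and is lost, with no per-byte accounting.

```python
# Sketch of the receiver-side pass/fail decision DTM effectively makes.

def crc24(payload: bytes) -> int:
    # Placeholder checksum for illustration only -- NOT the BLE CRC-24.
    return sum(payload) & 0xFFFFFF

def classify(payload: bytes, rx_crc: int, expected_len: int,
             expected_pattern: bytes) -> str:
    if crc24(payload) != rx_crc:
        return "lost"       # CRC mismatch: noise? corruption? no way to tell
    if len(payload) != expected_len:
        return "lost"       # wrong length
    if payload != expected_pattern:
        return "lost"       # content doesn't match the expected DTM pattern
    return "received"       # only fully intact packets are counted

# Alternating-bit pattern, one of the standard DTM payloads (illustrative):
pattern = bytes([0x55] * 37)
classify(pattern, crc24(pattern), 37, pattern)   # -> "received"
```

So a "blocking test" counting lost *bytes* isn't expressible in DTM: the receiver never attributes a failure to a particular byte, only to the packet as a whole.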
