
125 kbps PHY Rx Sensitivity Not As Low As It Should Be

Hi - 

In reading through the data sheets for the Nordic chips that support the low-rate (125 kbps) BT 5 PHY, I'm seeing Rx sensitivity numbers around -103 dBm. For the 1 Mbps PHY, it's around -96 dBm. I would expect the Rx sensitivity for 125 kbps to be 12 dB lower than 1 Mbps, i.e., -96 - 12 = -108 dBm. 9 of the 12 dB comes from reducing the bit rate by 8x, the other 3 dB comes from coding gain added to the low rate S=8 PHY. Why are you guys leaving 5 dB on the table? This is a big miss, in my opinion, since 5 dB amounts to almost a 2x range increase outdoors!
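The arithmetic behind this expectation can be sketched in a few lines of Python. The -96 dBm and -103 dBm figures are the data-sheet numbers quoted above; the 3 dB coding gain for the S=8 coded PHY is an assumption of the argument, not a measured value:

```python
import math

# Link-budget sketch for the expected 125 kbps sensitivity.
sens_1M = -96.0                       # dBm, 1 Mbps PHY Rx sensitivity (data sheet)
rate_gain = 10 * math.log10(8)        # ~9 dB from the 8x lower bit rate
coding_gain = 3.0                     # dB, assumed S=8 coding gain
expected_125k = sens_1M - rate_gain - coding_gain   # ~ -108 dBm
measured_125k = -103.0                # dBm, quoted from the data sheet
shortfall = measured_125k - expected_125k           # the ~5 dB "on the table"
print(round(expected_125k), round(shortfall))       # -108 5
```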

Do you have any plans in the future to improve the PHY performance to make up for the 5 dB shortfall?

Thanks,

Gary Sugar

  • I am sorry you are disappointed with the range improvement. May I ask what type of application you are working on?

    Comparing 1 Mbps to Long Range, we are "gaining" 3 dB for 1M, compared to the 12 dB improvement for LR stated by the Bluetooth SIG. This is because we were already using a discriminator-type receiver for BLE 1 Mbps and are therefore missing out on that improvement. The rest of the reduction is unfortunately a trade-off between selectivity and sensitivity, which is something we will try to improve in future products. Our expectation is that BLE 125 kbps would be up to 9 dB better than 1 Mbps.

  • I'm working on a healthcare IoT application, where lots of tags running around a hospital are beaconing at 125 kbps. In the long run, the hospital networking equipment will have BT receivers built in (along with WiFi, typically). But before that happens, hospitals will need to use an overlay network of gateways to pick up the BT transmissions. I say overlay because most hospitals already have WiFi APs now, but they're not yet equipped with BT of any sort. Asking a hospital to purchase and install a second overlay network of BT gateways is OK if there are only 2-4 gateways per floor. But a 6 dB drop in sensitivity could easily quadruple the number of gateways, making our overall business case less compelling.

    Your technical description of why you're giving up the 5 dB still isn't clear. If you use the SAME EXACT detection scheme for LR as you do for 1 Mbps, you would gain 9 dB in sensitivity - this is because the data rate is 8x slower, and 10*log10(8) = 9 dB. So why don't you just do that? Then you'll get another 3 dB from the coding gain.

    Your discriminator and selectivity vs. sensitivity trade-off makes no sense to me. Discriminator-based detection is known to be about 3 dB worse than coherent, non-discriminator-based detection, and could explain a 3 dB loss in performance somewhere, but it doesn't answer my question above about using the same exact detection scheme. Both the discriminator and sensitivity vs. selectivity comments sound like they came from someone who doesn't really understand how the Rx demod works, throwing out buzzwords to try and appease an upset customer.

    This is a really important issue for me and any other customer who wants long range to really be long range, and therefore it's an important issue for Nordic. So I think it's worth taking the time to answer this correctly. You should probably even have a white paper on this.
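The gateway-count concern above can be sketched numerically, assuming a simple log-distance path-loss model with exponent n (n = 2 is free-space-like; real indoor exponents vary, so treat the numbers as illustrative):

```python
# Sketch of how a sensitivity shortfall shrinks coverage and inflates
# gateway count, under an assumed log-distance path-loss model.

def range_factor(delta_db, n=2.0):
    """Fraction of the original range left after losing delta_db of link budget."""
    return 10 ** (-delta_db / (10 * n))

def gateway_factor(delta_db, n=2.0):
    """Relative gateway count to cover the same floor area (area ~ range^2)."""
    return 1.0 / range_factor(delta_db, n) ** 2

# A 6 dB sensitivity drop at n = 2 halves the range and therefore
# roughly quadruples the number of gateways needed per floor.
print(round(range_factor(6.0), 2), round(gateway_factor(6.0), 1))  # 0.5 4.0
```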
