
125 kbps PHY Rx Sensitivity Not As Low As It Should Be

Hi - 

In reading through the data sheets for the Nordic chips that support the low-rate (125 kbps) BT 5 PHY, I'm seeing Rx sensitivity numbers around -103 dBm. For the 1 Mbps PHY, it's around -96 dBm. I would expect the Rx sensitivity for 125 kbps to be 12 dB lower than 1 Mbps, i.e., -96 - 12 = -108 dBm. 9 of the 12 dB comes from reducing the bit rate by 8x; the other 3 dB comes from the coding gain added by the low-rate S=8 PHY. Why are you guys leaving 5 dB on the table? This is a big miss, in my opinion, since 5 dB amounts to almost a 2x range increase outdoors!
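
Here's a quick sanity check of that arithmetic in plain Python. The 3 dB coding-gain figure is the commonly cited value for the S=8 convolutional FEC, so treat it as an assumption rather than a measured number:

    # Expected 125 kbps (LE Coded, S=8) sensitivity, derived from the 1 Mbps
    # figure. Assumes sensitivity scales directly with bit rate, plus an
    # assumed ~3 dB coding gain from the S=8 FEC.
    import math

    sens_1m_dbm = -96.0                 # 1 Mbps Rx sensitivity from the data sheet
    rate_gain_db = 10 * math.log10(8)   # 8x lower bit rate -> ~9.0 dB
    coding_gain_db = 3.0                # assumed S=8 coding gain

    expected_125k_dbm = sens_1m_dbm - rate_gain_db - coding_gain_db
    print(f"expected: {expected_125k_dbm:.1f} dBm")                        # ~ -108.0 dBm
    print(f"shortfall vs. -103 dBm: {-103.0 - expected_125k_dbm:.1f} dB")  # ~ 5.0 dB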

Do you have any plans in the future to improve the PHY performance to make up for the 5 dB shortfall?

Thanks,

Gary Sugar

  • Hi,

    Unfortunately, the Long Range (125 kbps) PHY does not categorically add 12 dB of sensitivity. For 1 Mbps we were already applying some methods to improve the range, so we do not get the theoretical maximum improvement compared to our 1 Mbps performance. Regardless, you can find more details on Long Range here: https://devzone.nordicsemi.com/nordic/nordic-blog/b/blog/posts/testing-long-range-coded-phy-with-nordic-solution-it-simply-works-922075585

  • I'm personally an expert when it comes to digital communications and signal processing, so I'm not buying your answer - sorry. For example, if you can do -96 dBm at 1 Mbps, it would be trivial to lower the sensitivity by 9 dB just by decreasing the data rate to 125 kbps - without any coding. It doesn't matter what extra steps you took to achieve -96 dBm at 1 Mbps. Can you please ask an expert to help you come up with a better explanation? There has to be some detailed reason for the shortfall that would make sense to someone like me who's experienced with this kind of stuff, e.g., the preamble can't be reliably detected below -103 dBm even though the data can be reliably decoded down to -108 dBm (if this happened, it could be a protocol design issue - a bad one). Another example explanation would be that you're using a phase-coherent demod or an equalizer of some sort for 1 Mbps, but haven't had the time yet to develop it for 125 kbps. I'm trying to figure out two things here: (1) what's the specific reason for the 5 dB shortfall, and (2) can it be fixed in the future?
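
    To spell out why the 9 dB should come essentially for free, here's a back-of-the-envelope sensitivity budget. The noise figure and required-SNR values below are made-up placeholders, not Nordic's numbers; the point is that only the bandwidth term changes with the data rate:

        # Back-of-the-envelope Rx sensitivity budget. NF_DB and SNR_REQ_DB are
        # hypothetical illustration values; only the 10*log10(bandwidth) term
        # depends on the data rate.
        import math

        THERMAL_NOISE_DBM_HZ = -174.0   # kTB noise density at room temperature
        NF_DB = 7.0                     # hypothetical receiver noise figure
        SNR_REQ_DB = 12.0               # hypothetical SNR the demod needs

        def sensitivity_dbm(bit_rate_hz: float) -> float:
            # Assume the receiver noise bandwidth scales with the bit rate.
            return THERMAL_NOISE_DBM_HZ + 10 * math.log10(bit_rate_hz) + NF_DB + SNR_REQ_DB

        delta_db = sensitivity_dbm(1e6) - sensitivity_dbm(125e3)
        print(f"gain from the 8x rate reduction alone: {delta_db:.1f} dB")  # ~9.0 dB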

  • Sorry - one more comment. I'm not trying to be a jerk about this - sorry if I'm coming across that way. I was just really surprised and disappointed to see that what I thought was going to be a 12 dB range increase turned out to only be a 7 dB increase. I'm working on a very range-sensitive IoT application.

  • I am sorry you are disappointed with the range improvement. May I ask what type of application you are working on?

    Comparing 1 Mbps to Long Range, we are already "gaining" 3 dB at 1 Mbps out of the 12 dB improvement for LR stated by the Bluetooth SIG. This is because we were already using a discriminator-type receiver for BLE at 1 Mbps, and are therefore missing out on that part of the improvement. The rest of the reduction is unfortunately a trade-off between selectivity and sensitivity, which is something we will try to improve in future products. But our expectation is that BLE at 125 kbps would be up to 9 dB better than 1 Mbps.
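
    As a rough illustration of how the numbers reconcile under this explanation (bookkeeping only, not a formal link budget):

        # dB bookkeeping for the explanation above (illustrative only).
        sig_claim_db = 12.0        # LR improvement stated by the Bluetooth SIG
        captured_at_1m_db = 3.0    # gain our discriminator receiver already has at 1 Mbps
        expected_lr_gain_db = sig_claim_db - captured_at_1m_db   # up to 9 dB vs. our 1 Mbps
        measured_lr_gain_db = -96.0 - (-103.0)                   # 7 dB from the data sheets
        # The remaining ~2 dB is the selectivity vs. sensitivity trade-off.
        print(expected_lr_gain_db - measured_lr_gain_db)         # 2.0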

  • I'm working on a healthcare IoT application, where there are lots of tags running around a hospital, beaconing at 125 kbps. In the long run, the hospital networking equipment will have BT receivers built in (along with WiFi, typically). But before that happens, hospitals will need to use an overlay network of gateways to pick up the BT transmissions. I say overlay because most hospitals already have WiFi APs, but they're not yet equipped with BT of any sort. Asking a hospital to purchase and install a second overlay network of BT gateways is OK if there are only 2-4 gateways per floor. But a 6 dB drop in sensitivity could easily quadruple the number of gateways (see the rough scaling sketch at the end of this post), making our overall business case less compelling.

    Your technical description of why you're giving up the 5 dB still isn't clear. If you use the SAME EXACT detection scheme for LR as you do for 1 Mbps, you would gain 9 dB in sensitivity - this is because the data rate is 8x slower, and 10*log10(8) = 9 dB. So why don't you just do that? Then you'd get another 3 dB from the coding gain. Your discriminator and selectivity-vs.-sensitivity trade-off makes no sense to me. Discriminator-based detection is known to be about 3 dB worse than coherent detection, and could explain a 3 dB loss in performance somewhere, but it doesn't answer my question above about using the same exact detection scheme. Both the discriminator and sensitivity-vs.-selectivity comments sound like they came from someone who doesn't really understand how the Rx demod works, throwing out buzzwords to try to appease an upset customer. This is a really important issue for me and for any other customer who wants long range to really be long range. And therefore it's an important issue for Nordic. So I think it's worth taking the time to answer this correctly. You should probably even have a white paper on this.
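
    For what it's worth, here's the rough gateway scaling behind my business-case claim above. The path-loss exponents are generic assumptions, not site measurements:

        # How a sensitivity shortfall translates into extra gateways.
        # Range scales as 10^(dB / (10 * n)) for path-loss exponent n, and the
        # gateways needed to cover a floor scale as 1 / range^2. The exponents
        # below (free space, typical indoor) are assumed values.
        def gateway_factor(shortfall_db: float, path_loss_exp: float) -> float:
            range_factor = 10 ** (shortfall_db / (10 * path_loss_exp))
            return range_factor ** 2  # coverage area per gateway shrinks as range^2

        for n in (2.0, 3.0):
            print(f"n = {n}: {gateway_factor(6.0, n):.1f}x more gateways")
        # n = 2.0: ~4.0x, n = 3.0: ~2.5x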
