
125 kbps PHY Rx Sensitivity Not As Low As It Should Be

Hi - 

In reading through the datasheets for the Nordic chips that support the low-rate (125 kbps) Bluetooth 5 PHY, I'm seeing Rx sensitivity numbers around -103 dBm. For the 1 Mbps PHY, it's around -96 dBm. I would expect the Rx sensitivity for 125 kbps to be 12 dB lower than for 1 Mbps, i.e., -96 - 12 = -108 dBm: 9 of those 12 dB come from reducing the bit rate by 8x (10*log10(8) ≈ 9 dB), and the other 3 dB come from the coding gain of the S=8 coded PHY. Why are you guys leaving 5 dB on the table? This is a big miss, in my opinion, since 5 dB amounts to almost a 2x range increase outdoors!
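For concreteness, here is the arithmetic behind that expectation as a minimal Python sketch. All of the dB values come from the numbers quoted above; the only added assumption is free-space (20*log10(d)) path loss for the range estimate:

```python
import math

# Link-budget arithmetic behind the question (values from the post).
sens_1m = -96.0                                    # datasheet Rx sensitivity at 1 Mbps, dBm
rate_gain = 10 * math.log10(1_000_000 / 125_000)   # ~9.03 dB from the 8x slower bit rate
coding_gain = 3.0                                  # coding gain of the S=8 coded PHY, dB

expected = sens_1m - rate_gain - coding_gain       # ~ -108 dBm
datasheet = -103.0                                 # quoted 125 kbps sensitivity, dBm
shortfall = datasheet - expected                   # ~5 dB worse than expected

# Range impact, assuming free-space path loss (20*log10(d) distance term):
range_factor = 10 ** (shortfall / 20)              # ~1.78x, i.e. "almost 2x"

print(f"expected sensitivity:    {expected:.1f} dBm")
print(f"shortfall vs. datasheet: {shortfall:.1f} dB")
print(f"equivalent range factor: {range_factor:.2f}x")
```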

Do you have any plans in the future to improve the PHY performance to make up for the 5 dB shortfall?

Thanks,

Gary Sugar

  • I'm personally an expert in digital communications and signal processing, so I'm not buying your answer - sorry. For example, if you can achieve -96 dBm at 1 Mbps, it would be trivial to lower the sensitivity by 9 dB just by decreasing the data rate to 125 kbps, without any coding (see the sketch below). It doesn't matter what extra steps you took to achieve -96 dBm at 1 Mbps. Could you please ask an expert to help you come up with a better explanation? There has to be some detailed reason for the shortfall that would make sense to someone like me who's experienced with this kind of thing. For example: the preamble can't be reliably detected below -103 dBm even though the data can be reliably decoded down to -108 dBm (if that's what's happening, it would be a protocol design issue - a bad one). Another example explanation would be that you're using a phase-coherent demodulator or an equalizer of some sort for 1 Mbps, but haven't yet had the time to develop it for 125 kbps. I'm trying to figure out two things here: (1) what's the specific reason for the 5 dB shortfall, and (2) can it be fixed in the future?
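To illustrate the rate-scaling point, here is a minimal sketch using the standard receiver sensitivity relation (thermal noise floor + noise figure + 10*log10(bit rate) + required Eb/N0). The noise figure and Eb/N0 values below are hypothetical, chosen only so the 1 Mbps case lands at the quoted -96 dBm; they are not Nordic's actual numbers:

```python
import math

def sensitivity_dbm(noise_figure_db: float, bit_rate_hz: float, ebn0_req_db: float) -> float:
    """Standard sensitivity relation: kT thermal noise density (-174 dBm/Hz
    at ~290 K) + noise figure + 10*log10(bit rate) + required Eb/N0."""
    return -174.0 + noise_figure_db + 10 * math.log10(bit_rate_hz) + ebn0_req_db

# Hypothetical values, picked so the 1 Mbps result matches the -96 dBm datasheet number:
nf = 6.0     # assumed receiver noise figure, dB
ebn0 = 12.0  # assumed required Eb/N0 for GFSK at the target BER, dB

s_1m   = sensitivity_dbm(nf, 1_000_000, ebn0)  # ~ -96 dBm
s_125k = sensitivity_dbm(nf, 125_000, ebn0)    # ~ -105 dBm: ~9 dB better from rate alone

print(f"1 Mbps:   {s_1m:.1f} dBm")
print(f"125 kbps: {s_125k:.1f} dBm (delta {s_1m - s_125k:.1f} dB, before any coding gain)")
```

With the same noise figure and required Eb/N0, dropping the bit rate by 8x shrinks the in-band noise by 10*log10(8) ≈ 9 dB, which is the poster's point: the rate reduction alone should buy 9 dB even with no coding gain at all.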
