
RSSI Accuracy in dBm

Hi,

The documentation for the nRF52840 (section 6.20.15.9) states that RSSI has a typical accuracy of ±2 dB. I have measured signal strength at multiple distances, and the resulting standard deviation of my data at each distance agrees with this value. However, I am surprised that the standard deviation stays approximately constant at 2 dB, since dBm is a logarithmic scale; I would have expected the accuracy to be constant in linear power units (milliwatts) instead.
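For reference, the conversion I am using is the standard dBm definition (not something quoted from the datasheet): P_mW = 10^(P_dBm / 10). Under that definition a fixed ±2 dB spread is a fixed multiplicative factor of 10^(±0.2), roughly ×1.58 / ×0.63, so the absolute spread in milliwatts scales with the power level itself.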

For example, after taking repeated measurements of signal strength at 20 cm and 50 cm, I calculated average powers of -20 dBm and -45 dBm respectively. The two sets of values had approximately equal standard deviations of about 2 dB. After converting to linear units, however, the -20 dBm dataset had a much larger variance in milliwatts than the -45 dBm dataset.
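To illustrate, here is a minimal sketch of that conversion (Python; the -20 dBm / -45 dBm means and the 2 dB deviation are the example figures above, not my raw data):

def dbm_to_mw(p_dbm):
    # Standard conversion from dBm to milliwatts: P_mW = 10^(P_dBm / 10)
    return 10.0 ** (p_dbm / 10.0)

for mean_dbm in (-20.0, -45.0):
    mean_mw = dbm_to_mw(mean_dbm)
    low_mw = dbm_to_mw(mean_dbm - 2.0)   # one 2 dB deviation below the mean
    high_mw = dbm_to_mw(mean_dbm + 2.0)  # one 2 dB deviation above the mean
    print(f"{mean_dbm:.0f} dBm = {mean_mw:.3e} mW; "
          f"+/-2 dB band: {low_mw:.3e} to {high_mw:.3e} mW "
          f"(width {high_mw - low_mw:.3e} mW)")

Running this shows that the +/-2 dB band at -20 dBm is about 316 times (10^2.5) wider in milliwatts than at -45 dBm, which matches the variance difference I described.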

It appears as though each reading carries an error of up to 2 dB. The only explanation I can think of is that this error is introduced during signal conversion by the ADC. Any clarification on this would be much appreciated.

Best wishes,

Thavish
