I have an nRF52832 receiver board that detects a tag transmitting a 10-byte payload every second.
I have an attenuator on the board, placed before the receiver's antenna, so that I can set the required attenuation level (gain).
I can set gains of 0 dB, -4 dB, -8 dB, -12 dB, -16 dB, -20 dB, -24 dB and -28 dB. With a gain of 0 dB on the attenuator, I get a maximum detection distance of 20 meters: the tag is detected anywhere within that 20-meter range. With a gain of -28 dB, the maximum distance drops to 3 meters.
For 0 dB gain, I get an RSSI value of about 90 at 20 meters. For -28 dB gain, I get an RSSI value of about 90 at 3 meters. Whatever gain I set on the attenuator, the RSSI at its maximum distance is always about 90.
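For what it's worth, the numbers above look consistent with a log-distance path-loss model in which detection cuts out at the receiver's sensitivity floor, so the RSSI reading at maximum range is always the same floor value regardless of attenuation. A minimal sketch of that reasoning (assumptions: the reported 90 is a magnitude, i.e. roughly -90 dBm; a single path-loss exponent n fits the whole range; the exponent is solved from the two observed data points, 20 m at 0 dB and 3 m at 28 dB of attenuation):

```python
import math

# Log-distance path loss: PL(d) = PL(d0) + 10*n*log10(d/d0).
# Received power = Tx - PL(d) - A, where A is the extra attenuation in dB.
# At the detection limit the received power equals a constant sensitivity
# floor, so: 10*n*log10(d_max_0dB / d_max_A) = A. Solve for n.

A = 28.0             # extra attenuation between the two settings (dB)
d_0dB, d_28dB = 20.0, 3.0  # observed maximum ranges (m)

n = A / (10.0 * math.log10(d_0dB / d_28dB))
print(f"implied path-loss exponent n ~ {n:.2f}")  # ~3.4, typical indoors

# Predicted maximum range for each attenuator setting, anchored at 0 dB -> 20 m.
# At every one of these distances the received power sits at the same floor,
# which is why the RSSI reading at maximum range does not change with A.
for a in range(0, 32, 4):
    d_max = d_0dB * 10 ** (-a / (10.0 * n))
    print(f"attenuation {a:2d} dB -> d_max ~ {d_max:5.1f} m")
```

The point of the sketch: adding attenuation shifts the whole RSSI-versus-distance curve down, so the distance at which it crosses the sensitivity floor moves closer, but the RSSI *at that crossing* is the floor itself every time.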
Why does the RSSI value depend on the gain of the attenuator?
I was under the impression that the RSSI value would change only with the distance between the receiver and the transmitter.
Rgds,