Hi all,
According to this link, the RSSI value should be negative, but the register stores a positive value; so when I read a value of 60, for example, the RSSI is -60 dBm.
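Just to be explicit about how I interpret the register, something like this (a minimal sketch; rssi_sample_to_dbm is my own helper name, and it assumes TASKS_RSSISTART was already triggered and EVENTS_RSSIEND has fired):

    #include <stdint.h>
    #include "nrf.h"

    /* RSSISAMPLE holds only the magnitude of the measured signal strength,
       so a reading of 60 corresponds to -60 dBm. */
    static inline int8_t rssi_sample_to_dbm(void)
    {
        return (int8_t)(-(int32_t)NRF_RADIO->RSSISAMPLE);
    }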
In my test, I used the following code to choose a relatively stable channel:
    static int esb_get_rf_channel_rssi(uint8_t ch)
    {
        int rssi_reading = 0;

        nrf_esb_set_rf_channel(ch);
        nrf_esb_start_rx();
        nrf_delay_us(300);

        for (int i = 0; i < 10; i++)
        {
            NRF_RADIO->EVENTS_RSSIEND = 0;          /* clear the stale event before sampling */
            NRF_RADIO->TASKS_RSSISTART = 1;
            while (NRF_RADIO->EVENTS_RSSIEND == 0)
                ;                                   /* wait for the sample to complete */
            rssi_reading += NRF_RADIO->RSSISAMPLE;  /* magnitude only: 60 means -60 dBm */
            nrf_delay_us(100);
        }

        nrf_esb_stop_rx();
        nrf_delay_us(300);

        return rssi_reading / 10;                   /* average of the 10 samples */
    }
The results are shown here. I scanned multiple times and chose the channel whose reading fluctuated the least, roughly like the sketch below.
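For context, the scan loop looks roughly like this (a minimal sketch; SCAN_CHANNEL_MAX and SCAN_PASSES are just my test values, and NRF_LOG_INFO assumes the SDK logger module is enabled):

    #define SCAN_CHANNEL_MAX 100   /* upper channel of the sweep, my test value */
    #define SCAN_PASSES      5     /* passes per channel, my test value */

    static void scan_channels(void)
    {
        for (uint8_t ch = 0; ch <= SCAN_CHANNEL_MAX; ch++)
        {
            int sum = 0;
            for (int pass = 0; pass < SCAN_PASSES; pass++)
            {
                sum += esb_get_rf_channel_rssi(ch);
            }
            /* RSSISAMPLE is a magnitude, so negate it to get dBm */
            NRF_LOG_INFO("channel %d: avg %d (= %d dBm)",
                         ch, sum / SCAN_PASSES, -(sum / SCAN_PASSES));
        }
    }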
At first, I chose a channel with a minimum reading (e.g. 44), and the packet loss rate was almost 50% at 1.5M without retransmission.
Then I chose a channel with the maximum reading (e.g. 100 or so); on this channel the packet loss rate was about 10%, again at 1.5M without retransmission.
I would expect the channel at -60 dBm (for example) to be better than the one at -100 dBm.
Note: the channel with the minimum reading fluctuates over a wider range than the channel with the maximum reading.
So, is there something wrong with my test, or is my test environment just bad?
The SDK is 17.1.0, on an nRF52832.
Best regards,
Lurn