I'm testing a simple application based on the UDP/CoAP example. It samples some peripherals and ADC channels, and the values are packed into a CoAP message. The CoAP message is sent through the socket API (UDP), and the application then waits for a response. I've run tests with different power-saving settings; the current version uses PSM between samples and RAI to avoid spending a long time in the 'active timer'.
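For context, the uplink path looks roughly like the sketch below. This is only a minimal illustration assuming an nRF91 with the nRF Connect SDK (Zephyr sockets and the Zephyr CoAP helpers); the RAI socket option name (SO_RAI_ONE_RESP here) differs between SDK versions, and send_sample(), the buffer sizes, and the server address are placeholder names, not the exact code from my application.

```c
/* Minimal sketch of the uplink path: build a confirmable CoAP request,
 * request RAI, and send it over the UDP socket. Assumes nRF Connect SDK;
 * SO_RAI_ONE_RESP is a placeholder for the RAI option in your SDK version. */
#include <errno.h>
#include <zephyr/net/socket.h>
#include <zephyr/net/coap.h>

static int send_sample(int sock, const struct sockaddr_in *server,
		       const uint8_t *payload, size_t payload_len)
{
	uint8_t buf[128];
	struct coap_packet request;
	int err;

	/* Build a confirmable CoAP POST carrying the sampled values. */
	err = coap_packet_init(&request, buf, sizeof(buf), COAP_VERSION_1,
			       COAP_TYPE_CON, 8, coap_next_token(),
			       COAP_METHOD_POST, coap_next_id());
	if (err) {
		return err;
	}
	err = coap_packet_append_payload_marker(&request);
	if (err) {
		return err;
	}
	err = coap_packet_append_payload(&request, payload, payload_len);
	if (err) {
		return err;
	}

	/* Hint to the modem that this is the last uplink and that one more
	 * downlink (the CoAP response) is expected, so the connection can be
	 * released right after it arrives instead of waiting out the active
	 * timer. The exact option name depends on the SDK version. */
	err = setsockopt(sock, SOL_SOCKET, SO_RAI_ONE_RESP, NULL, 0);
	if (err) {
		return -errno;
	}

	err = sendto(sock, request.data, request.offset, 0,
		     (const struct sockaddr *)server, sizeof(*server));
	return err < 0 ? -errno : 0;
}
```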
An observation we made after analyzing the logs is the high rate of timeouts/packet loss. We declare a packet lost if either the uplink CoAP packet or the downlink CoAP ACK is lost (a rough sketch of this classification follows the questions below). We measured a packet loss of around 3% over roughly 1000 messages (multiple devices, all connected to the same cell). Questions:
- I know UDP is an unreliable protocol by design, but I'm wondering what packet-loss numbers I should expect. What numbers are other users seeing?
- Is there some way to identify the cause of these lost packets, for example from the PCAP/modem log files?
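For reference, the loss classification works roughly as in the sketch below (again just an illustration, not the exact application code; RESPONSE_TIMEOUT_MS and the counters are made-up names): after each uplink, the application waits a fixed time for the downlink CoAP ACK and counts the message as lost if nothing arrives.

```c
/* Rough sketch of the timeout-based loss classification, assuming Zephyr
 * sockets. Names and the timeout value are illustrative only. */
#include <errno.h>
#include <stdint.h>
#include <zephyr/net/socket.h>

#define RESPONSE_TIMEOUT_MS 10000

static uint32_t sent_count;
static uint32_t lost_count;

/* Returns 0 if a downlink arrived within the timeout, a negative error
 * otherwise. Loss rate = lost_count / sent_count. */
static int wait_for_ack(int sock)
{
	struct pollfd fds = {
		.fd = sock,
		.events = POLLIN,
	};
	uint8_t resp[128];
	int ret;

	sent_count++;

	ret = poll(&fds, 1, RESPONSE_TIMEOUT_MS);
	if (ret <= 0 || !(fds.revents & POLLIN)) {
		/* No ACK within the timeout: either the uplink or the
		 * downlink was lost; counted as one lost message. */
		lost_count++;
		return -ETIMEDOUT;
	}

	ret = recv(sock, resp, sizeof(resp), 0);
	if (ret < 0) {
		lost_count++;
		return -errno;
	}

	/* The received buffer can then be parsed with coap_packet_parse()
	 * and matched against the request's message ID/token before being
	 * counted as delivered. */
	return 0;
}
```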
The devices all have a Taoglas MFX3 antenna connected, and the reported RSRP is around -85 dBm (%CESQ: 54,2,21,3).