
RAI and Sockets

I'm testing a simple application based on the UDP example. It samples some peripherals and ADC channels, packs the values into a CoAP message, sends that message through the socket API, and then waits for a response:

  1. Create CoAP message
  2. Send CoAP message using send(), no flags
  3. Use poll() to poll for response
  4. recv() when poll indicated there is something to read
  5. repeat after a couple of seconds

This works fine with RAI disabled, but the power draw is too high: the response usually comes after 1-2 seconds, and with the network's RRC inactivity timeout of 10 seconds a lot of time is wasted waiting for nothing. Now I want to enable RAI, and I came across several options:

  • Use 
    lte_lc_rai_param_set("3");
    lte_lc_rai_req(true);
  • Use AT%RAI
  • Use AT%XRAI
  • Use 
    err = setsockopt(sock, SOL_SOCKET, SO_RAI_ONE_RESP, NULL, 0);

Which one should I use? I'm running modem FW 1.3.1 and SDK 1.7.0, all on NB-IoT.
