LTE-M is consuming far too much power transmitting 100 bytes of UDP every 15 seconds; please help me understand why.

I'm using the UDP sample from SDK 1.8.0 with the following modifications:

  • No PSM
  • No eDRX
  • Send 100 bytes every 15 seconds

This is on the nRF9160 DK board.

I suspect I can improve power consumption by getting the modem to drop out of RRC connected mode sooner, but I cannot find any documentation on how to do this.

I am also unsure of the actual cDRX timing parameters, as what appear to be DRX events occur sporadically.

Here are some power profile images and a modem trace, in case they are helpful.

The above image shows a long period of almost continuous activity for ~5 seconds after the TX event, and then iDRX activity until the next TX event.

This is the activity while RRC connected. 

I am seeing an average of 10 mA, with 22 mA during the connected period. I was expecting an average current consumption of 2.5 to 4 mA.

7723.trace-2022-02-14T21-46-37.030Z.bin

  • Want to add that RAI on LTE-M is a Release 14 feature. I'm not sure how widely it is available, but some networks do support it, and it has to be activated with a proprietary AT command:

    %REL14FEAT

    Combining TCP and RAI is not recommended, though: with TCP you never know whether re-transmissions will occur, so you can never be sure that the last packet you sent really was the final one rather than a re-transmit.


    Regards,
    Jonathan

  • Hi EDLT, many thanks for the reply.

    Interesting, I had experimented with REL14FEAT but without success. I actually had no luck getting NRF_SO_RAI_NO_DATA to work on NB-IoT, and ended up using NRF_SO_RAI_LAST with an extra send to force a release. Maybe on LTE-M I need to go back to trying NRF_SO_RAI_NO_DATA as you do.

    It could also be my use of the NRF_-prefixed versions with nrf_socket that makes the difference. Or just that my network doesn't support it... I'll see what I can do.

    Thanks again, and thanks Jonathan. 
    Jake
