Configuring PSM, eDRX, and RAI for lowest possible power

Environment

  • My hardware is the nRF9151 DK and a custom board with the nRF9151 (Monkeytronics Airsmart Cellular Beta 3).
  • Using nRF tools and nRF Connect SDK v3.1.1.
  • Using Onomondo and SoftSIM, which incidentally is causing some trouble.
  • Based in New Zealand and currently on the Spark network.

Goal (simplified)

  • Wake every hour, take some readings, and send an uplink packet of 100–500 bytes.
  • Occasionally, a downlink packet will be sent in response to an uplink. It will be held pending in the Monkeytronics cloud.
  • Achieve lowest possible average power consumption. 

Power Saving Plan

  • PSM enabled.
  • Periodic tau = 24 hours ("00111000")
  • Active time = 12 s ("00000110")
  • eDRX enabled.
  • eDRX period = 5.12 s ("0000")
  • PTW = 1.28 s ("0000")
  • RAI Enabled.

Questions

  1. Are the above settings sensible, and well suited to achieving the lowest power consumption?
  2. Is there any reason why these settings might be unachievable with Spark NZ? I will also contact them to check.
    1. Note a): the LTE event handler sees "network is rejecting the PSM parameters (with active time = -1)".
    2. Note b): the LTE event handler sees the RAI event only once and logs: "MCC: 530, MNC: 5, AS RAI: false, CP RAI: false".
  3. Can you confirm whether either or both of the following approaches will successfully implement the above power-saving plan:

## PSM
CONFIG_LTE_PSM_REQ=y
CONFIG_LTE_LC_PSM_MODULE=y

## eDRX
CONFIG_LTE_EDRX_REQ=y
CONFIG_LTE_LC_EDRX_MODULE=y

## RAI
CONFIG_LTE_RAI_REQ=y
CONFIG_LTE_LC_RAI_MODULE=y

/* Just after nrf_modem_lib_init() */

lte_lc_psm_param_set("00111000", "00000110"); /* TAU = 24 h, active time = 12 s */
lte_lc_psm_req(true);

lte_lc_ptw_set(LTE_LC_LTE_MODE_LTEM, "0000");        /* PTW = 1.28 s */
lte_lc_edrx_param_set(LTE_LC_LTE_MODE_LTEM, "0000"); /* eDRX cycle = 5.12 s */
lte_lc_edrx_req(true);

rai_set();

  • Is there any reason why these settings might be unachievable with Spark NZ?

    In my "indirect experience" (someone else ran the tests with my CoAP client about 1.5 years ago), Spark NZ has very different base stations.

    In those tests, NB-IoT offered PSM and CP-RAI, and LTE-M offered PSM and AS-RAI. But unfortunately not on all base stations.

    I'm afraid you will need modem traces and a lot of patience.

  • Hello,

    I would recommend you look at the configs used in the NCS samples, like this one: https://docs.nordicsemi.com/bundle/ncs-latest/page/nrf/samples/cellular/udp/README.html#additional_configuration .

    If you want to know what parameters are optimal, you can use the Online Power Profiler tool. Even though it's for the nRF9160, it should give you a good idea of what parameters to choose for the nRF9151 as well.

    But your best option is to talk to your network operator to decide on the parameters. They will also be able to tell you what PSM values will be accepted.



    • PSM enabled.
    • Periodic tau = 24 hours ("00111000")
    • Active time = 12 s ("00000110")
    • eDRX enabled.
    • eDRX period = 5.12 s ("0000")
    • PTW = 1.28 s ("0000")
    • RAI Enabled.

    What do you expect from using PSM and eDRX together?

    Just in case you want some more details, you may find them in Improving Energy Efficiency for Mobile IoT, GSMA 2022.

  • Thanks for your helpful responses. My thinking on using PSM, AS-RAI, and eDRX together was to set the periodic TAU higher than necessary, with the expectation of waking every hour or two to send a message. I would then expect RAI to release the connection promptly, after which the device would sit in eDRX, reachable for incoming messages during the one or two paging windows that should fit into the 12-second active-time period.


    My intent may well conflict with practical considerations, but I feel that the more information I gather from others' experience, the more sensible my tests can be. Honestly, I'm not certain what the various settings do or how they interact.

  • OK, at least it's now clear why that combination was chosen.

    Whether it's worth it may depend on a lot of things.

    Tests in "laboratories" provide precise numbers, but those numbers may fail to hold up in the field.

    Let me therefore recommend:

    - Don't only measure the energy in your lab; also run a field test with several devices in different locations to see whether the lab values reflect real usage. In my experience, common LiPos have the disadvantage that a test of only a week or two doesn't easily reveal the energy consumption, but you may try it. I now use super-caps for that: their discharge is close to linear, and they make it much easier to see the energy consumption in a one- or two-week test run.

    - Start with a very simple case, e.g. one-way with RAI_LAST, PSM with 0 s active time, and no eDRX. That should give you an estimate of the very lowest energy consumption, but it doesn't provide a way to send data back.

    - Then test your advanced cases and see whether they change the energy consumption in a relevant way.

    - Besides the idea of always having an "optional response", it may be worth checking whether a system that sends clear one-way messages, and only sometimes sends a message that expects a response, would also work in your case.

    - You may also check whether always sending a response (using RAI_ONE_RESP (CP-RAI) and RAI_NO_DATA (AS-RAI)), perhaps mostly an empty one, has a significantly larger energy consumption or not.
