Modem voltage measurement period

Hello DevZone,

I have a project where I currently use the AT%XPOFWARN feature to turn off the modem when the battery voltage gets too low. It is used in both cellular and GPS mode.

This feature is enabled because the battery used is a bit weak in terms of instantaneous current. The goal is to stop the modem before the voltage drops too low and resets the entire system. The trigger is set to 3.0 V.
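For reference, this is how the warning is configured, assuming I am reading the nRF91 AT command reference correctly (level 30 maps to 3000 mV):

```
AT%XPOFWARN=1,30
```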

However, it seems a bit too sensitive: the battery is in parallel with a 1 F supercapacitor, yet XPOFWARN still gets triggered quite often. It's hard to get good measurements, but it looks like the voltage drops for a few µs (or ms?) and then returns to an acceptable level. This tiny dip is enough to trigger XPOFWARN.

I understand that this is expected behavior for AT%XPOFWARN, but it is too sensitive for my case.

My question is: is there a way to make this feature less sensitive to very short voltage variations?

- What is the default sampling rate of XPOFWARN?

- Is there a way to adjust the sampling used for XPOFWARN?

- If not, is it possible to set up an average, for example over 1 second?

Similar questions apply to AT%XVBATLOWLVL. The documentation only states "The modem reads sensors periodically in connected mode. The default period is 60 seconds. If the temperature or voltage gets close to the set threshold, a shorter period is used", with no further details.

Is it a good approach to use AT%XVBATLOWLVL to replicate a "slow-trigger AT%XPOFWARN"? Or is it better to use the ADC (the battery voltage is already measured by an ADC on my PCB)?
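To illustrate the ADC alternative I have in mind: a minimal sketch of a software moving-average filter over battery readings, where the power-off decision is only made when the windowed average (not a single sample) falls below 3.0 V. All names and sizes here are my own, not from any Nordic API; with 8 samples taken every 125 ms this would approximate the 1-second average asked about above.

```c
#include <assert.h>
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical software filter over ADC battery readings: instead of
 * reacting to a single low sample, keep a moving average of the last
 * WINDOW samples and only raise the power-off flag when the *average*
 * falls below the threshold. */

#define WINDOW 8            /* samples in the moving average */
#define THRESHOLD_MV 3000   /* same 3.0 V trigger level */

typedef struct {
    int samples[WINDOW];    /* ring buffer of recent readings, in mV */
    size_t idx;             /* next slot to overwrite */
    size_t count;           /* readings seen so far, capped at WINDOW */
    long sum;               /* running sum of the buffer */
} vbat_filter_t;

static void vbat_filter_init(vbat_filter_t *f)
{
    for (size_t i = 0; i < WINDOW; i++)
        f->samples[i] = 0;
    f->idx = 0;
    f->count = 0;
    f->sum = 0;
}

/* Feed one ADC reading (millivolts). Returns true only once the
 * buffer is full and its average is below THRESHOLD_MV, so a dip
 * shorter than the window cannot trigger on its own. */
static bool vbat_filter_feed(vbat_filter_t *f, int mv)
{
    f->sum -= f->samples[f->idx];
    f->samples[f->idx] = mv;
    f->sum += mv;
    f->idx = (f->idx + 1) % WINDOW;
    if (f->count < WINDOW) {
        f->count++;
        return false;       /* warm-up: not enough samples yet */
    }
    return (f->sum / (long)WINDOW) < THRESHOLD_MV;
}
```

The feed function would be called from a periodic timer or work item that reads the SAADC; a single sub-window dip (e.g. one 2.5 V sample among 3.2 V readings) leaves the average above the threshold, while a sustained drop fills the window and triggers.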

Thanks.

Vincent
