Hey guys,
I have been fighting with this for a couple of days and I am hoping someone can give me some insights as to what might be going on.
We currently use the nRF24L01P, and I am evaluating the nRF52 as a potential replacement because it is backwards compatible with our current hardware via ESB. Our applications are very low power, so I need to prove that it doesn't break our power budget, but I have hit a roadblock.
I have a relatively simple application that basically does the following:
1) Set up and run app_timer with a 1-second interval
2) After 4 seconds have elapsed, call esb_init(), then set up a payload and call nrf_esb_write_payload()
3) In nrf_esb_event_handler(), call nrf_esb_disable()
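In case it helps, the flow above looks roughly like this. This is a simplified sketch rather than my exact code: it assumes the nRF5 SDK app_timer and nrf_esb APIs, and the timer/payload setup details (e.g. the APP_TIMER_TICKS signature, which differs between SDK versions) may not match my project exactly.

```c
#include "app_timer.h"
#include "nrf_esb.h"

APP_TIMER_DEF(m_tick_timer);
static volatile uint32_t m_seconds = 0;

// ESB event handler: disable the radio as soon as the single TX completes.
static void esb_event_handler(nrf_esb_evt_t const * p_event)
{
    if (p_event->evt_id == NRF_ESB_EVENT_TX_SUCCESS ||
        p_event->evt_id == NRF_ESB_EVENT_TX_FAILED)
    {
        (void)nrf_esb_disable();
    }
}

static uint32_t esb_init(void)
{
    nrf_esb_config_t config = NRF_ESB_DEFAULT_CONFIG;
    config.event_handler = esb_event_handler;
    return nrf_esb_init(&config);
}

// 1-second app_timer tick; after 4 ticks, init ESB and send one payload.
static void tick_handler(void * p_context)
{
    if (++m_seconds == 4)
    {
        // Dummy two-byte payload on pipe 0.
        static nrf_esb_payload_t payload = NRF_ESB_CREATE_PAYLOAD(0, 0x01, 0x02);
        (void)esb_init();
        (void)nrf_esb_write_payload(&payload);
    }
}

static void timers_start(void)
{
    (void)app_timer_create(&m_tick_timer, APP_TIMER_MODE_REPEATED, tick_handler);
    (void)app_timer_start(m_tick_timer, APP_TIMER_TICKS(1000), NULL);
}
```

Between timer ticks the application just sits in a wait-for-event loop, so the CPU should be sleeping whenever I take the current measurements.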
What I am finding is that prior to the transmission I see about 2uA average power consumption. After the transmission, the average rises to 7-8uA. On a scope, the average goes up because the current spikes caused by the Refresh Mode of the DCDC/LDO regulator become about four times more frequent.
My question is: why is this the case? After I disable the radio, shouldn't the device return to the lower-power state it was in before the transmission? From what I have read, Refresh Mode periodically tops up the regulator's output capacitors. Why would it need to charge them more frequently after a single transmission has occurred? The rate does not change after the transmission completes (in other words, it never settles back to the original rate). I also only see this problem if I call nrf_esb_write_payload(). If I just call esb_init() but never transmit anything, the refresh rate stays at the lower rate.
Thanks!
-Chris