Grounding a resistor divider circuit through a GPIO pin for extreme power savings

Hi there, I have a potentially dumb (or creative) question that I can't seem to get a good answer on.

I am building a device that needs to be conscious of µA-level power consumption. It will be in deep sleep for most of its operating life.
I want to make a low-accuracy battery-monitoring circuit using a resistor divider network.
I've seen other designs that use multiple transistors to enable and disable this simple circuit, but the board is only 14x14 mm, so I don't have space for much of anything.

I had the idea of grounding the resistor divider through a pin on the nRF52840 set as a low output. Then, when I want to disable the network for zero power draw, I would set the sink and ADC pins to high impedance and deep-sleep the processor.
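For rough numbers, here's a quick estimate of what the divider costs while enabled versus disabled. The resistor values are assumptions for illustration (R2 = 6 MΩ matches the value mentioned in the answer below; R1 is a guess):

```python
# Rough divider-power estimate; R1 is an assumption for illustration.
VBAT = 4.2   # worst case, battery on charge (V)
R1 = 4e6     # top resistor (ohms), assumed
R2 = 6e6     # bottom resistor (ohms), matches the 6 MOhm in the answer below

I_enabled = VBAT / (R1 + R2)   # divider current with the sink pin driven low
V_adc = VBAT * R2 / (R1 + R2)  # voltage the ADC pin sees

print(f"enabled: {I_enabled*1e9:.0f} nA, ADC node at {V_adc:.2f} V")
# With both pins high-impedance the path is broken; what remains is only
# GPIO input leakage, which is far below the divider's active current.
```

So even while enabled the divider only draws a few hundred nA at these values; the sink-pin trick removes that entirely during sleep.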

I don't see this being done anywhere, which makes me nervous.

To avoid exposing the ADC pin to the battery voltage, I would be careful to set the ADC pin to high impedance first and then the sink pin. When the processor wakes up, I would set the sink pin to low output and then the ADC pin to an input.

My understanding is that the pins will be left floating and disconnected in this high-impedance state. I have seen people suggest that voltage into a pin before VDD is connected can back-feed and cause issues at power-on. The same concern applies to a freshly populated PCB being powered for the first time, so my question becomes: for an unprogrammed, stock nRF52840, what state do the pins default to?

Other context: I am using the QKAA BGA, with VDDH and VDD configured for external supply. A potential scenario: the battery is charging at 4.2 V, the processor is off, VDDH is connected to 4.2 V, and nothing is on VDD (since it may not be configured yet, or is disabled in sleep). Now I have 4.2 V through a resistor into a high-impedance ADC pin and nothing on VDD. Is this an issue?

  • The SAADC input impedance is somewhat over 1 MΩ, so when the pin is configured as an ADC input that impedance sits in parallel with R2 at 6 MΩ; the source impedance also sets the minimum SAADC sample time. However, if the filter capacitor is large enough, and the pin is only configured as an ADC input just before taking a reading and then disconnected, you will get away with a simple (or no) calibration factor. How large is large enough? At, say, a 20 µs SAADC sample/acquisition time, the capacitor should discharge into the ~1 MΩ load by less than the error you can tolerate. Try 100 nF, which should give less than 1% error provided (enabling the SAADC + sample time + reading the SAADC) takes under 100 µs.
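That capacitor sizing can be sanity-checked numerically. A sketch assuming a 100 nF cap discharging into a ~1 MΩ SAADC input for the full 100 µs window (both values from the estimate above):

```python
import math

C = 100e-9   # filter capacitor (F)
R_in = 1e6   # assumed SAADC input impedance (ohms)
t = 100e-6   # enable + sample + read window (s)

# Fraction of the held voltage lost while the SAADC loads the node:
# V(t) = V0 * exp(-t / (R_in * C)), so droop = 1 - exp(-t/RC).
droop = 1 - math.exp(-t / (R_in * C))
print(f"droop over {t*1e6:.0f} us: {droop*100:.2f}%")  # ~0.1%, well under 1%
```

With an RC time constant of 100 ms against a 100 µs window, the droop is about 0.1%, so there is an order of magnitude of margin against the 1% budget.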
