
Error when reading VDD with SAADC

Hi, our product is ready for production and we are preparing the commissioning software.

The nRF52840 is powered from a 3.7V LiPo via an external switching regulator that outputs 3.3V to VDD.

We can accurately measure the battery voltage via an external resistor divider into an analog input.

We would also like to measure the 3.3V regulator output in software during production testing to verify that everything is OK.

But when we select VDD as the input to the SAADC, we get a voltage reading of about 4.4V, well above 3.3V.

We are using the internal reference of 0.6V and a gain of 1/6, which means we should be able to measure up to 3.6V?

I am the electronics engineer and don't write the software. I have stepped through the code with the developer and can't see anything wrong. The SAADC code is common to both the battery and VDD readings, apart from the input selected. We are using SDK 15.0.

Any ideas?

Thanks

Leon

  • Haakonsh,

    yes I understand your reasoning and we will look at it again.

    But, my query was really asking if our setup is valid. That is, reading VDD with the SAADC with 1/6 gain.

    I assume that internally the SAADC has a fixed full-scale range of 0.6V, and that the input buffer/amplifier is adjustable.

    So, with a gain of 1/6, this really means it attenuates the input voltage so that the SAADC core sees 1/6 of the input, giving a full-scale input range of 6 * 0.6V = 3.6V.

    Can you confirm this, please?

    Thanks

    Leon

  • Leon Williams said:
    So, with a gain of 1/6, this really means it attenuates the input voltage so that the SAADC sees 1/6 of the input voltage, and therefore a full scale input range of 6 * 0.6V = 3.6V.

    Yes, that is correct. 

    Also, any voltage higher than the input range will be read as the highest value of the input range. For example, a 3.7V signal would read the same as a 3.6V signal.
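    That saturation behaviour can be sketched as follows (a minimal illustration, assuming 12-bit single-ended resolution with gain 1/6 and the 0.6V internal reference; the helper name is illustrative, not an SDK function):

    ```c
    #include <assert.h>
    #include <stdio.h>

    /* Expected 12-bit SAADC code for a given input voltage with
     * gain 1/6 and the 0.6 V internal reference (full scale 3.6 V).
     * Inputs above full scale saturate at the maximum code. */
    static int saadc_expected_code(double v_in)
    {
        int code = (int)(v_in * (1.0 / 6.0) / 0.6 * 4096.0);
        if (code > 4095) code = 4095;  /* clipped: reads as full scale */
        if (code < 0)    code = 0;
        return code;
    }

    int main(void)
    {
        /* 3.6 V and 3.7 V both produce the maximum code. */
        printf("3.6 V -> %d\n", saadc_expected_code(3.6));
        printf("3.7 V -> %d\n", saadc_expected_code(3.7));
        return 0;
    }
    ```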
