
What is the right factor to convert voltage using the SAADC?

Hello,

I am getting really confused when using the SAADC to convert the read values into voltages.

I have seen that the ble_app_proximity example uses the following macro:

#define ADC_REF_VOLTAGE_IN_MILLIVOLTS     600   // Internal reference voltage (0.6 V) in mV.
#define ADC_RES_10BIT                     1024  // Full scale at 10-bit resolution.
#define ADC_PRE_SCALING_COMPENSATION      6     // Compensates for the 1/6 input gain.
#define ADC_RESULT_IN_MILLI_VOLTS(ADC_VALUE)\
        ((((ADC_VALUE) * ADC_REF_VOLTAGE_IN_MILLIVOLTS) / ADC_RES_10BIT) * ADC_PRE_SCALING_COMPENSATION)
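
If I understand it correctly, this macro is just the general conversion V_in = code * V_ref / (gain * 2^bits), with the internal 0.6 V reference and the default 1/6 gain (so the full-scale input is 3.6 V). This is how I would write that general formula in C (the helper name is my own, not from the SDK):

#include <stdint.h>

// Hypothetical helper (my naming): general SAADC code-to-millivolt conversion.
// v_ref_mv:   reference voltage in mV (600 for the internal reference)
// gain:       ADC gain as a real number, e.g. 1.0f / 6.0f for NRF_SAADC_GAIN1_6
// full_scale: 2^resolution_bits, e.g. 1024 for 10-bit
static inline float saadc_code_to_millivolts(int16_t  adc_value,
                                             float    v_ref_mv,
                                             float    gain,
                                             uint32_t full_scale)
{
    // V_in [mV] = code * V_ref / (gain * 2^bits)
    return ((float)adc_value * v_ref_mv) / (gain * (float)full_scale);
}

// With v_ref_mv = 600, gain = 1/6 and full_scale = 1024 this reduces to
// ADC_RESULT_IN_MILLI_VOLTS: (adc * 600 / 1024) * 6.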

whereas the Thingy example uses the following method:

#define ADC_GAIN                    NRF_SAADC_GAIN1     // ADC gain.
#define ADC_REFERENCE_VOLTAGE       (0.6f)              // The standard internal ADC reference voltage.
#define ADC_RESOLUTION_BITS         (8 + (SAADC_CONFIG_RESOLUTION * 2)) // ADC resolution [bits].

static uint32_t adc_to_batt_voltage(uint32_t adc_val, uint16_t * const voltage)
{
    uint32_t err_code;
    float    adc_gain;
    uint16_t tmp_voltage;

    // Convert the gain enum (e.g. NRF_SAADC_GAIN1) to its numeric value.
    err_code = adc_gain_enum_to_real_gain(ADC_GAIN, &adc_gain);
    RETURN_IF_ERROR(err_code);

    // V_in = code / ((gain / V_ref) * 2^bits), in volts.
    float tmp = adc_val / ((adc_gain / ADC_REFERENCE_VOLTAGE) * pow(2, ADC_RESOLUTION_BITS));
    // Undo the external battery voltage divider and convert to mV.
    tmp_voltage = (uint16_t) ((tmp / m_battery_divider_factor) * 1000);
    *voltage = ((tmp_voltage + 5) / 10) * 10;   // Round to the nearest 10 mV.

    return M_BATT_STATUS_CODE_SUCCESS;
}
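
If I did the math right, both snippets implement the same underlying formula, and the differences are just which gain is configured and the extra battery divider factor. This is the quick sanity check I tried (standalone, the setup is my own, not from either example):

#include <math.h>
#include <stdio.h>

int main(void)
{
    const int adc_val = 512;    // example 10-bit sample

    // ble_app_proximity style: 0.6 V reference, 10-bit, prescaling x6.
    int mv_macro = ((adc_val * 600) / 1024) * 6;

    // Thingy style with the same settings: gain expressed as 1/6, result in volts.
    float adc_gain = 1.0f / 6.0f;
    float v = adc_val / ((adc_gain / 0.6f) * powf(2.0f, 10.0f));

    printf("macro: %d mV, thingy-style: %.0f mV\n", mv_macro, v * 1000.0f);
    // Both print 1800 mV, so the formulas seem to agree when the gain matches.
    return 0;
}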

Which one should be used, and when?

Thanks in advance
