ADC - First read is always wrong

Hi, 

I have been playing around with the nRF9160 DK trying to get it to read a range of sensors, so far with success; however, I have found an issue which has stumped me. I am using ADC0 to read a voltage. Although this is intended for a pressure transducer, for debugging I have just put the 5V output through a resistive divider (10K / 10K), and the ADC successfully reads 2.5V. 

However, when the program is first run, the very first ADC reading is around 50. The next reading, and every one after it, is correct (around 3500, using the 3V VDD as reference with a gain of 1/4 at 12-bit resolution). 
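
For reference: with a gain of 1/4 against the VDD/4 (~0.75V) reference, the full-scale input works out to 0.75 / 0.25 = 3V, so 2.5V should land around 2.5 / 3 × 4096 ≈ 3413 counts (assuming VDD is exactly 3.0V), which lines up reasonably well with the ~3500 I see on the good readings.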

I assumed this may be due to acquisition time, but I tried a few different values and nothing helped. 

I won't post the full code, but here is the configuration and read code:

#define ADC_RESOLUTION 12
#define ADC_GAIN ADC_GAIN_1_4
#define ADC_REFERENCE ADC_REF_VDD_1_4
#define ADC_ACQUISITION_TIME ADC_ACQ_TIME(ADC_ACQ_TIME_MICROSECONDS, 10)
#define ADC_1ST_CHANNEL_ID 0  
#define ADC_1ST_CHANNEL_INPUT NRF_SAADC_INPUT_AIN0

#define BUFFER_SIZE 1
static uint16_t m_sample_buffer[BUFFER_SIZE];

static struct adc_channel_cfg channel_cfg = {
	.gain             = ADC_GAIN,
	.reference        = ADC_REFERENCE,
	.acquisition_time = ADC_ACQUISITION_TIME,
	.differential     = 0,
	.channel_id       = ADC_1ST_CHANNEL_ID,
	.input_positive   = ADC_1ST_CHANNEL_INPUT,
};

static int adc_sample(void)
{
	int ret;

	struct adc_sequence sequence = {
		.channels    = BIT(channel_cfg.channel_id),
		.buffer      = m_sample_buffer,
		.buffer_size = sizeof(m_sample_buffer),
		.resolution  = ADC_RESOLUTION,
	};

	if (!adc_dev) {
		return -1;
	}

	ret = adc_read(adc_dev, &sequence);
	if (ret) {
		printk("adc_read() failed with code %d\n", ret);
	}

	for (int i = 0; i < BUFFER_SIZE; i++) {
		printk("ADC raw value: %d\n", m_sample_buffer[i]);
		/* n * 1000 * reference (3V) / 12-bit ADC */
		uint32_t mV = (m_sample_buffer[i] * 1000) * 3 / 4096;
		printk("ADC mV after Divider: %u\n", mV);
		mV = mV * 2;	/* undo the 10K/10K divider */
		printk("Voltage Sensed: %u\n", mV);
	}

	return ret;
}



//WITHIN MAIN

adc_dev = device_get_binding("ADC_0");
if (!adc_dev) {
	printk("device_get_binding ADC_0 failed\n");
}

NRF_SAADC_NS->TASKS_CALIBRATEOFFSET = 1;

err = adc_channel_setup(adc_dev, &channel_cfg);
if (err) {
	printk("Error in adc setup: %d\n", err);
}

//END

The adc_sample() function is called when button 1 is pressed, but I'm sure that shouldn't make a difference. 

Thanks, 

Damien

  • Hi Karl, 

    Thanks for the update. 

    Karl Ylvisaker said:
    Damien, could you confirm whether you meant that the unexpected sample happens periodically without the added printk delay, or only during startup of the SAADC peripheral?

    With the printk() in place it only happened the first time I tried reading; without the printk() it happened fairly often. I don't have a copy of the original code, as I was just playing around with it and trying out the libraries, etc., but it was essentially exactly what was in the original question of this post, with an added printk() before reading the buffer. 

    main.c would look a bit like this:

    #include <drivers/adc.h>
    #include <hal/nrf_saadc.h>
    #if defined(CONFIG_DK_LIBRARY)
    #include <dk_buttons_and_leds.h>
    #endif
    
    #define ADC_RESOLUTION 12
    #define ADC_GAIN ADC_GAIN_1_4
    #define ADC_REFERENCE ADC_REF_VDD_1_4
    #define ADC_ACQUISITION_TIME ADC_ACQ_TIME(ADC_ACQ_TIME_MICROSECONDS, 10)
    #define ADC_1ST_CHANNEL_ID 0  
    #define ADC_1ST_CHANNEL_INPUT NRF_SAADC_INPUT_AIN0
    
    #define BUFFER_SIZE 1
    /* ADC device */
    static const struct device *adc_dev;
    
    static uint16_t m_sample_buffer[BUFFER_SIZE];
    
    static struct adc_channel_cfg channel_cfg = {
    	.gain             = ADC_GAIN,
    	.reference        = ADC_REFERENCE,
    	.acquisition_time = ADC_ACQUISITION_TIME,
    	.differential     = 0,
    	.channel_id       = ADC_1ST_CHANNEL_ID,
    	.input_positive   = ADC_1ST_CHANNEL_INPUT,
    };
    
    static int adc_sample(void)
    {
    	int ret;
    
    	struct adc_sequence sequence = {
    		.channels    = BIT(channel_cfg.channel_id),
    		.buffer      = m_sample_buffer,
    		.buffer_size = sizeof(m_sample_buffer),
    		.resolution  = ADC_RESOLUTION,
    	};
    
    	if (!adc_dev) {
    		return -1;
    	}
    
    	ret = adc_read(adc_dev, &sequence);
    	if (ret) {
    		printk("adc_read() failed with code %d\n", ret);
    	}

    	for (int i = 0; i < BUFFER_SIZE; i++) {
    		printk("ADC raw value: %d\n", m_sample_buffer[i]);
    		/* n * 1000 * reference (3V) / 12-bit ADC */
    		uint32_t mV = (m_sample_buffer[i] * 1000) * 3 / 4096;
    		printk("ADC mV after Divider: %u\n", mV);
    		mV = mV * 2;	/* undo the 10K/10K divider */
    		printk("Voltage Sensed: %u\n", mV);
    	}
    
    	return ret;
    }
    
    int adc_value = 0;
    
    static void button_handler(uint32_t button_states, uint32_t has_changed)
    {
    	if (has_changed & button_states & BIT(0)) {
    		/* Note: adc_sample() returns the adc_read() status code, not the sample. */
    		adc_value = adc_sample();
    		printk("Value - %i\n", adc_value);
    	}
    }
    
    void main(void)
    {
    	int err = 0;

    	adc_dev = device_get_binding("ADC_0");
    	if (!adc_dev) {
    		printk("device_get_binding ADC_0 failed\n");
    	}

    	NRF_SAADC_NS->TASKS_CALIBRATEOFFSET = 1;

    	err = adc_channel_setup(adc_dev, &channel_cfg);
    	if (err) {
    		printk("Error in adc setup: %d\n", err);
    	}
    	
    #if defined(CONFIG_DK_LIBRARY)
    	dk_buttons_init(button_handler);
    #endif
    	
    }
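
    (Side note: I have since noticed that Zephyr also provides adc_raw_to_millivolts() for this conversion. Assuming VDD is exactly 3.0 V - so the VDD/4 reference is 750 mV - something like this in place of the manual maths should give the same numbers; untested, just for reference.)

    int32_t val_mv = m_sample_buffer[0];
    /* Reference is VDD/4, so pass 750 mV here (assumes VDD = 3.0 V). */
    int err = adc_raw_to_millivolts(750, ADC_GAIN, ADC_RESOLUTION, &val_mv);
    if (err == 0) {
    	printk("ADC mV after Divider: %d\n", val_mv);
    }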

    As for the version, it would have been v1.6.1. 

    Thanks, 

    Damien

  • Hi,

    I just noticed that when I have CONFIG_LOG_MODE_MINIMAL enabled, ADC readings are giving incorrect values. Usually I test our devices with minimal logging disabled (CONFIG_LOG_MODE_IMMEDIATE enabled) and there it is working properly.

    I removed the CALIBRATEOFFSET call and now it is working with minimal logging enabled too.

    Have you (DamoL and Karl Ylvisaker) tested the ADC without any calibration there?

    Regards,
    Tero

  • Hello again,

    Thank you for elaborating and providing the snippet, Damien.
    I will attempt to replicate the behavior on my end using the provided code and NCS version.

    anicare-tero said:
    I just noticed that when I have CONFIG_LOG_MODE_MINIMAL enabled, ADC readings are giving incorrect values. Usually I test our devices with minimal logging disabled (CONFIG_LOG_MODE_IMMEDIATE enabled) and there it is working properly.

    If the log messages are processed in place/immediately, it could introduce a delay akin to the one mentioned by Damien. However, I find it strange that my minimal example did not behave the same way when I tried to replicate this earlier. In my minimal application, the calibration was the only source of incorrect / unexpected samples - without any delay or timer interference - and it only ever happened directly following the calibration, unlike what Damien describes.

    anicare-tero said:

    I removed the CALIBRATEOFFSET call and now it is working with minimal logging enabled too.

    Have you (DamoL and Karl Ylvisaker) tested the ADC without any calibration there?

    I did try without calibration in my minimal example as well, and in that case I never saw any incorrect / unexpected samples being placed in the buffer. That is what led me to the conclusion that this was 'as expected' when the calibration was initiated without the SAADC first being stopped, as (poorly) described in the Product Specification.
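
    If you do want to keep the offset calibration, one option (a quick, untested sketch on my part, reusing the defines and buffer from your snippet - adc_sample_calibrated() is just a placeholder name) is to let the ADC driver perform the calibration as part of the conversion, via the calibrate flag of struct adc_sequence, instead of writing TASKS_CALIBRATEOFFSET directly:

    static int adc_sample_calibrated(void)
    {
    	/* Untested sketch: request SAADC offset calibration through the
    	 * Zephyr ADC API so the driver sequences it before sampling.
    	 */
    	struct adc_sequence sequence = {
    		.channels    = BIT(ADC_1ST_CHANNEL_ID),
    		.buffer      = m_sample_buffer,
    		.buffer_size = sizeof(m_sample_buffer),
    		.resolution  = ADC_RESOLUTION,
    		.calibrate   = true,	/* calibrate first, then sample */
    	};

    	return adc_read(adc_dev, &sequence);
    }

    That way the driver handles the ordering around the calibration itself, so in theory even the first sample placed in the buffer should be valid.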

    Best regards,
    Karl

  • Hello again, Tero and Damien

    I just wanted to let you know that I have been unable to reproduce the unexpected sample / buffer shift behavior without the offset calibration present, which aligns with my previous conclusion that this is an artifact of the CALIBRATEOFFSET behavior described in the nRF9160's Product Specification.

    Tero, if you are still seeing unexplained samples appear in your buffer, please open a separate ticket so that we can investigate that issue separately.

    Best regards,
    Karl

  • Hi Karl,

    Thanks for helping, I'm fine now without the offset calibration.

    BR,
    Tero
