
app_time and nrf_delay_ms

Hi

I would like to read values from three ADC channels, each with a different sampling frequency and a different number of samples captured per burst. To keep things clear, I will explain the problem for one channel. Every 10 s I would like to capture five values from the channel and then average them; the time between two samples should be around 10 ms. At this stage I do not know the best programming solution that would make the software as efficient as possible. Here is my current code, where I used the nrf_delay_ms(10) function to wait between two samples. I know this is not the most efficient way, so I would appreciate any advice on how I can improve my code.

Best regards

Samo

static void adc_lpg_timer_handler(void * p_context) // app_timer handler, executed every 10 s
{
    int             i;
    nrf_adc_value_t adc_value;
    uint32_t        err_code;   // APP_ERROR_CHECK() expects a 32-bit error code
    uint16_t        ave_adc = 0;

    nrf_gpio_pin_toggle(LPG_PIN);

    for (i = 0; i < 5; i++)
    {
        nrf_delay_ms(10);       // busy-wait between samples (CPU stays awake)
        while (nrf_drv_adc_is_busy())
        {
            // wait for any ongoing conversion to finish
        }
        err_code = nrf_drv_adc_sample_convert(&adc_lpg, &adc_value);
        APP_ERROR_CHECK(err_code);

        ave_adc = ave_adc + adc_value;
    }

    ave_adc = ave_adc / 5;
    ave_adc = 1200 * ave_adc / 1023;    // scale the 10-bit result to millivolts (1.2 V reference)
    err_code = ble_adcs_2_update(&m_adcs, ave_adc);
    APP_ERROR_CHECK(err_code);

    nrf_gpio_pin_toggle(LPG_PIN);
}
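One common way to avoid the blocking nrf_delay_ms(10) is a second, faster app_timer: the 10 s handler only starts a repeated 10 ms timer, and that timer's handler takes one sample per tick, stopping itself after the fifth. Below is a minimal sketch of that idea; the names m_sample_timer_id, sample_timer_handler, SAMPLES_PER_BURST, and APP_TIMER_PRESCALER are my own assumptions, and the timer must have been created once at init with app_timer_create() in APP_TIMER_MODE_REPEATED mode.

```c
#include "app_timer.h"
#include "app_error.h"
#include "nrf_drv_adc.h"

APP_TIMER_DEF(m_sample_timer_id);     // 10 ms repeated timer (hypothetical name)

#define SAMPLES_PER_BURST 5

static uint8_t  m_sample_count;
static uint32_t m_adc_sum;

// Runs every 10 ms; takes one sample, and after the fifth stops the timer,
// averages, scales to millivolts and sends the BLE update.
static void sample_timer_handler(void * p_context)
{
    nrf_adc_value_t adc_value;
    uint32_t err_code = nrf_drv_adc_sample_convert(&adc_lpg, &adc_value);
    APP_ERROR_CHECK(err_code);

    m_adc_sum += adc_value;

    if (++m_sample_count >= SAMPLES_PER_BURST)
    {
        err_code = app_timer_stop(m_sample_timer_id);
        APP_ERROR_CHECK(err_code);

        uint16_t ave_adc = 1200 * (m_adc_sum / SAMPLES_PER_BURST) / 1023;
        err_code = ble_adcs_2_update(&m_adcs, ave_adc);
        APP_ERROR_CHECK(err_code);
    }
}

// The existing 10 s handler now only kicks off the burst instead of blocking.
static void adc_lpg_timer_handler(void * p_context)
{
    m_sample_count = 0;
    m_adc_sum      = 0;
    uint32_t err_code = app_timer_start(m_sample_timer_id,
                                        APP_TIMER_TICKS(10, APP_TIMER_PRESCALER),
                                        NULL);
    APP_ERROR_CHECK(err_code);
}
```

Between ticks the CPU can sleep, so the 50 ms burst no longer costs 50 ms of busy-waiting. The same pattern extends to the three channels by keeping a per-channel sample count and sum.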
  • Hi Oyvind, I know this is a very old thread but I have a question that pertains to this concept.


    I have a TWI sensor that performs four operations, with a 20 ms delay required between them as mandated by the sensor datasheet. 

    I have gotten the code working using the nrf_delay_ms(20) function, but I have since realized that it's not power efficient. So, after perusing the forum, I have decided to implement the delay using application timers, following the tutorial. 

    My main function is,

        int main(void)
        {
            lfclk_request();

            APP_TIMER_INIT(0, 4, false);

            gpio_config();

            nrf_drv_gpiote_out_clear(LED_1_PIN);

            err_code = app_timer_start(m_led_b_timer_id, APP_TIMER_TICKS(2000, 0), NULL);
            APP_ERROR_CHECK(err_code);

            nrf_drv_gpiote_out_set(LED_1_PIN);
        }

    LED_1_PIN is LED 1 on the DevKit. 'm_led_b_timer_id' is a single-shot timer. The timer handler contains nothing but a UNUSED_PARAMETER call. I am using the nRF52832 with the S132 SoftDevice and SDK v12.2.

    This doesn't give me a 2 s time delay. Do you know where I am going wrong?

    Thanks,

    Vijay
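For what it's worth, the usual cause of this symptom is that app_timer_start() does not block: the line after it runs immediately, and if main() then returns, the timer never gets a chance to fire. The deferred action belongs in the timer handler, and main() needs a loop that keeps the program alive. A minimal sketch of that structure, reusing the names from the post above (lfclk_request and gpio_config are assumed to exist as in the original code, and the timer creation shown here is my assumption of how m_led_b_timer_id was set up):

```c
#include "app_timer.h"
#include "app_error.h"
#include "nrf_drv_gpiote.h"

APP_TIMER_DEF(m_led_b_timer_id);

// app_timer_start() returns immediately; the 2 s elapse in the background,
// so setting the LED belongs here, not on the line after app_timer_start().
static void led_b_timer_handler(void * p_context)
{
    UNUSED_PARAMETER(p_context);
    nrf_drv_gpiote_out_set(LED_1_PIN);
}

int main(void)
{
    uint32_t err_code;

    lfclk_request();
    APP_TIMER_INIT(0, 4, false);
    gpio_config();

    err_code = app_timer_create(&m_led_b_timer_id,
                                APP_TIMER_MODE_SINGLE_SHOT,
                                led_b_timer_handler);
    APP_ERROR_CHECK(err_code);

    nrf_drv_gpiote_out_clear(LED_1_PIN);

    err_code = app_timer_start(m_led_b_timer_id, APP_TIMER_TICKS(2000, 0), NULL);
    APP_ERROR_CHECK(err_code);

    // Without this loop main() returns and the program ends before the
    // timer fires; sleep until the RTC interrupt wakes the CPU.
    for (;;)
    {
        __WFE();
    }
}
```

The same pattern handles the four TWI operations: each single-shot handler performs one operation and re-arms the timer with APP_TIMER_TICKS(20, 0) for the next step, so the CPU sleeps during the 20 ms gaps.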
