
How to Calibrate the Timer?

I am using the nRF52832 with SDK 13.0.0 to develop my own application. I want to generate a 1000 Hz timer to trigger the ADC, and I use TIMER1 to generate it. But when I use GPIOTE to measure the frequency on a pin, I get 500.014 Hz (500.014 x 2 = 1000.028 Hz). I have already enabled the crystal oscillator as the HFCLK source.
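For reference, the crystal request and the GPIOTE frequency test look roughly like the sketch below (this is a simplified illustration rather than my exact code; TEST_PIN is an arbitrary placeholder, and the PPI routing is just one way to toggle a pin on every compare event, which is also why the pin runs at half the compare frequency):

#include "nrf_drv_clock.h"
#include "nrf_drv_gpiote.h"
#include "nrf_drv_ppi.h"
#include "nrf_drv_timer.h"
#include "app_error.h"

#define TEST_PIN 17 /* arbitrary free GPIO used only for the measurement */

static void hfclk_and_test_pin_setup(nrf_drv_timer_t const * p_timer)
{
    ret_code_t err_code;

    /* Request the external crystal (HFXO) as the HFCLK source; otherwise the
     * timer is clocked from the less accurate internal RC oscillator. */
    err_code = nrf_drv_clock_init();
    APP_ERROR_CHECK(err_code);
    nrf_drv_clock_hfclk_request(NULL);
    while (!nrf_drv_clock_hfclk_is_running())
    {
        /* wait for the crystal to start */
    }

    /* Toggle TEST_PIN on every COMPARE0 event via PPI, so a 1000 Hz compare
     * rate shows up as a 500 Hz square wave on the pin. */
    err_code = nrf_drv_gpiote_init();
    APP_ERROR_CHECK(err_code);
    nrf_drv_gpiote_out_config_t out_cfg = GPIOTE_CONFIG_OUT_TASK_TOGGLE(false);
    err_code = nrf_drv_gpiote_out_init(TEST_PIN, &out_cfg);
    APP_ERROR_CHECK(err_code);
    nrf_drv_gpiote_out_task_enable(TEST_PIN);

    err_code = nrf_drv_ppi_init();
    APP_ERROR_CHECK(err_code);
    nrf_ppi_channel_t ppi_channel;
    err_code = nrf_drv_ppi_channel_alloc(&ppi_channel);
    APP_ERROR_CHECK(err_code);
    err_code = nrf_drv_ppi_channel_assign(
        ppi_channel,
        nrf_drv_timer_compare_event_address_get(p_timer, NRF_TIMER_CC_CHANNEL0),
        nrf_drv_gpiote_out_task_addr_get(TEST_PIN));
    APP_ERROR_CHECK(err_code);
    err_code = nrf_drv_ppi_channel_enable(ppi_channel);
    APP_ERROR_CHECK(err_code);
}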

Because two devices have to sample the sensor data at the same rate, I need an accurate frequency; at the very least, the two devices must generate the same frequency. But when I download the same program to two devices, the frequencies are different: one is 500.014 Hz, the other is 500.012 Hz. What should I do?

My timer code is:
//========================
// timer initialization
//========================
nrf_drv_timer_config_t timer_cfg = NRF_DRV_TIMER_DEFAULT_CONFIG;
timer_cfg.bit_width = NRF_TIMER_BIT_WIDTH_32;
err_code = nrf_drv_timer_init(&m_timer1, &timer_cfg, timer_dummy_handler);
APP_ERROR_CHECK(err_code);
/* setup m_timer1 for compare event every 1ms */
uint32_t ticks = nrf_drv_timer_us_to_ticks(&m_timer1, 1000); // ticks = 16000 (0x3E80) at the default 16 MHz timer frequency
nrf_drv_timer_extended_compare(&m_timer1,
                               NRF_TIMER_CC_CHANNEL0,
                               ticks,
                               NRF_TIMER_SHORT_COMPARE0_CLEAR_MASK,
                               false);
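
For completeness, the rest of the setup around that snippet looks roughly like this (a sketch reconstructed from the names used above; the main points are that nrf_drv_timer_init() requires a handler even when the interrupt is disabled, and that the timer has to be started with nrf_drv_timer_enable()):

#include "nrf_drv_timer.h"

/* TIMER1 instance; TIMER1 must be enabled in sdk_config.h */
static const nrf_drv_timer_t m_timer1 = NRF_DRV_TIMER_INSTANCE(1);

/* nrf_drv_timer_init() requires a non-NULL handler even though the compare
 * interrupt is disabled here (the last argument of
 * nrf_drv_timer_extended_compare() is false). */
static void timer_dummy_handler(nrf_timer_event_t event_type, void * p_context)
{
    (void)event_type;
    (void)p_context;
}

/* ... after the configuration above, start the timer: */
nrf_drv_timer_enable(&m_timer1);

With this, the COMPARE0 event fires every 16000 ticks of the 16 MHz timer clock, i.e. every 1 ms, and the COMPARE0_CLEAR shortcut restarts the count automatically.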

Question one: how can I calibrate the timer to generate an accurate 1000 Hz? Question two: how can I make sure the two devices generate the same frequency?
