Hello, DevZone!
Recently, I've read a lot of questions and replies related to the nRF5's SAADC and BLE.
Even though I found some pages discussing continuous sampling with a timer, the concept still confuses me.
First of all, I want to write code that continuously reads voltage from a single channel and sends it via Bluetooth.
To do so, my colleague combined two examples from nRF5 SDK v16.0.0: ble_peripheral\ble_app_uart and peripheral\saadc.
In the early stage of development, the code worked as we expected: the SAADC sampled data and the samples were transmitted to an Android device via Bluetooth.
Then, to make use of Bluetooth 5's high throughput, I modified the code to sample data at a 20 kHz sampling rate.
According to the throughput demo, I can get 1365 kbps with the BLE 5 high-speed mode. That means 20 kHz * 16 bits (the samples are actually 12 bits, but packets are built in byte units) = 320 kbps should be easy to achieve.
The problem is that my code is not that fast. I'm using a 7.5 ms connection interval and the GAP event length is set to 6. Data length extension is already enabled and each packet carries 240 data bytes (for some reason the MTU=247 setting keeps crashing, so I lowered it a little).
Currently, NRF_ERROR_RESOURCES keeps appearing from ble_nus_data_send(), and I think it's because I send data too frequently.
So maybe I can slow down the ADC and find out what the current maximum throughput is.
Phew, here comes my main question:
How can I set the sampling rate to a specific value?
If I lower the transfer rate, the sampling rate should also be lowered to guarantee low delay.
So I wrote the code like this:
#define SAMPLES_IN_BUFFER 10
#define SAMPLES_TO_SEND   120

static nrf_saadc_value_t adc_buf[2][SAMPLES_IN_BUFFER];
static uint8_t           to_snd_buf[SAMPLES_TO_SEND * 2];
static uint16_t          idx_snd_buf = 0;

(...)

void saadc_sampling_event_init(void)
{
    (...)
    // 100 us timer period -> intended 10 kHz sampling rate
    uint32_t ticks = nrf_drv_timer_us_to_ticks(&TIMER_ADC, 100);
    nrf_drv_timer_extended_compare(&TIMER_ADC, NRF_TIMER_CC_CHANNEL0, ticks,
                                   NRF_TIMER_SHORT_COMPARE0_CLEAR_MASK, false);
    nrf_drv_timer_enable(&TIMER_ADC);
    (...)
}

void saadc_callback(nrf_drv_saadc_evt_t const * p_event) // Original version
{
    if (p_event->type == NRF_DRV_SAADC_EVT_DONE)
    {
        ret_code_t err_code;

        // Re-queue the buffer that just completed
        err_code = nrf_drv_saadc_buffer_convert(p_event->data.done.p_buffer, SAMPLES_IN_BUFFER);
        APP_ERROR_CHECK(err_code);
        adc_buf_state ^= 0x01;

        // Pack the 10 new samples into the send buffer (little-endian)
        for (uint16_t i = 0; i < SAMPLES_IN_BUFFER; i++)
        {
            to_snd_buf[idx_snd_buf * 2]     = adc_buf[adc_buf_state][i] & 0xff;
            to_snd_buf[idx_snd_buf * 2 + 1] = (adc_buf[adc_buf_state][i] >> 8) & 0xff;
            idx_snd_buf++;
        }

        // Send only when 120 samples (240 bytes) have accumulated
        if (idx_snd_buf >= SAMPLES_TO_SEND)
        {
            uint16_t length = (uint16_t)(SAMPLES_TO_SEND * 2);
            notification_err_code = ble_nus_data_send(&m_nus, to_snd_buf, &length, m_conn_handle);
            sprintf(error_string, "Error number: %#x\n", notification_err_code);
            SEGGER_RTT_WriteString(0, error_string);
            idx_snd_buf = 0;
        }
    }
}

(...)
The main point of the code is that I store 10 ADC samples per NRF_DRV_SAADC_EVT_DONE event, and ble_nus_data_send() is called only when the stored data reaches 120 samples.
By doing this, I expect both the sampling rate and the transfer rate to decrease: previously I got 120 samples in one scoop and sent them immediately, but now I have to wait for 12 timer callbacks before sending.
In other words, I thought the timer triggers the ADC to fill the buffer, so the sampling rate is controlled by the timer. In my case, since I set the timer period to 100 us, the sampling rate should be 10 kHz.
But when I tested with a function generator, the sampling rate seemed to differ from what I expected.
Also, there are comments saying that changing the acquisition time will affect the ADC's sampling rate: "use a 3 us acquisition time to make use of the 200 kHz sampling rate."
Can anyone explain to me how the timer tick, ADC acquisition time, and sampling rate work together?
How can I get a 10 kHz or 20 kHz sampling rate?
And also, why does NRF_ERROR_RESOURCES appear and keep me from reaching the maximum throughput?