
Estimate CPU blocking time during connection

Hi all! I have a Smart Beacon Kit rev 1 running the S110 SoftDevice (the latest firmware available). I am using the RTC with the 32 kHz crystal (50 ppm) to trigger an ADC conversion every 2 ms, giving a sampling frequency of 500 Hz (roughly as sketched after the list below). I tried to raise this to 1 kHz, but the device cannot keep up (I lose samples), and I have read that the problem is the CPU blocking time. My beacon is connected to a DK that receives the samples, and I have the following connection parameters:

  • connection interval=10 ms
  • slave latency=0
  • 6 packets per connection interval, each of 20 bytes
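
For context, this is roughly how the 2 ms sample trigger is set up. It is only a minimal sketch using bare nRF51 registers (names from the nRF51 reference manual): it assumes the SoftDevice has already started the 32 kHz LFCLK, that RTC1 is free (in the SDK it is often taken by app_timer), and that the ADC itself is configured elsewhere; the interrupt priority is kept at an application level allowed alongside the S110.

    #include "nrf.h"

    #define SAMPLE_TICKS 66u   /* 66 ticks at 32768 Hz ~= 2.014 ms */

    void rtc1_sampling_init(void)
    {
        NRF_RTC1->PRESCALER = 0;                         /* 32768 Hz tick           */
        NRF_RTC1->CC[0]     = SAMPLE_TICKS;
        NRF_RTC1->INTENSET  = RTC_INTENSET_COMPARE0_Msk; /* interrupt on COMPARE0   */
        NVIC_SetPriority(RTC1_IRQn, 3);                  /* app-level priority (S110 allows 1 or 3) */
        NVIC_EnableIRQ(RTC1_IRQn);
        NRF_RTC1->TASKS_START = 1;
    }

    void RTC1_IRQHandler(void)
    {
        if (NRF_RTC1->EVENTS_COMPARE[0])
        {
            NRF_RTC1->EVENTS_COMPARE[0] = 0;
            /* schedule the next compare, wrapping at the 24-bit counter width */
            NRF_RTC1->CC[0] = (NRF_RTC1->CC[0] + SAMPLE_TICKS) & 0x00FFFFFF;
            NRF_ADC->TASKS_START = 1;                    /* start one ADC conversion */
        }
    }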

I need to estimate the CPU blocking time to justify why I cannot reach the 1 ms period (and consequently 1 kHz sampling). How can I "calculate" it? I understand that, with these connection parameters, the only factors influencing the blocking time are the number of packets (the maximum allowed in my case) and their length in bytes. How are these factors related to the blocking time (perhaps through approximate formulas)? Thanks!
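
For reference, here is the back-of-envelope estimate I can come up with for the length of the radio event itself under these parameters. It assumes 1 Mbit/s on-air timing, an unencrypted link, and 20-byte notifications; the constants are illustrative and not taken from the S110 SoftDevice Specification, so please correct them if they are off.

    #include <stdio.h>

    int main(void)
    {
        const double us_per_byte = 8.0;   /* 1 Mbit/s on-air rate                     */
        const double t_ifs_us    = 150.0; /* inter-frame space between packets        */
        const int    overhead_b  = 10;    /* preamble + access address + header + CRC */
        const int    payload_b   = 27;    /* 20 B ATT value + 3 B ATT + 4 B L2CAP     */
        const int    packets     = 6;     /* notifications per connection event       */

        double slave_pkt_us  = (overhead_b + payload_b) * us_per_byte; /* ~296 us          */
        double master_pkt_us = overhead_b * us_per_byte;               /* empty ACK, ~80 us */
        double per_pair_us   = slave_pkt_us + t_ifs_us + master_pkt_us + t_ifs_us;

        double event_us = packets * per_pair_us;  /* total radio event length */

        printf("radio event ~ %.0f us of a 10000 us connection interval\n", event_us);
        return 0;
    }

This gives a radio event of roughly 4 ms out of every 10 ms connection interval. If the RTC interrupt is held off for a large part of such an event (plus whatever pre/post-processing the SoftDevice does around it), that would explain why a 2 ms period mostly survives while a 1 ms period starts missing samples, but I would like to confirm this reasoning and get the real blocking figures.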
