
nRF52840 Timer peripheral

I am using the peripheral_timer example as a template to play with timers. I changed timer_ticks to 8000 to generate a timer interrupt every 500 microseconds (8000 ticks at the 16 MHz timer clock); all other timer configuration is left at the defaults. I also modified timer_led_handler to do nothing but toggle LED3, so I can measure the timer period. Roughly, the change looks like the sketch below.
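A minimal sketch of the modified example, assuming the nRF5 SDK legacy timer driver (the timer instance and LED index are illustrative, not necessarily what the stock example uses):

#include "nrf_drv_timer.h"
#include "boards.h"

static const nrf_drv_timer_t m_timer = NRF_DRV_TIMER_INSTANCE(1);

/* Toggle LED3 on every compare match so the period shows on a scope. */
static void timer_led_handler(nrf_timer_event_t event_type, void * p_context)
{
    if (event_type == NRF_TIMER_EVENT_COMPARE0)
    {
        bsp_board_led_invert(2);  /* LED3 is index 2 on the DK boards */
    }
}

int main(void)
{
    bsp_board_init(BSP_INIT_LEDS);

    nrf_drv_timer_config_t cfg = NRF_DRV_TIMER_DEFAULT_CONFIG;  /* 16 MHz default */
    APP_ERROR_CHECK(nrf_drv_timer_init(&m_timer, &cfg, timer_led_handler));

    /* 8000 ticks at 16 MHz = 500 us; clear the counter on every match. */
    nrf_drv_timer_extended_compare(&m_timer, NRF_TIMER_CC_CHANNEL0, 8000,
                                   NRF_TIMER_SHORT_COMPARE0_CLEAR_MASK, true);
    nrf_drv_timer_enable(&m_timer);

    for (;;)
    {
        __WFE();
    }
}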

I am seeing a lot of jitter in the timing (up to 200 µs), measured with an oscilloscope in persistence mode.

Since this example does not contain any BLE code and there is no SoftDevice, can I use TIMER0?

Also, when I changed the timer interrupt priority to 0 or 2, I saw no difference.

Can you explain this behavior?

  • You really can't do that. Software interrupts aren't suited to microsecond-level activity. That is down to the physical work you are asking the processor to do, and it has nothing to do with Nordic or any other microcontroller SoC.

    Every time the NVIC accepts an interrupt request, a fair amount of work happens: before your handler runs, the core pushes a stack frame to memory (R0-R3, R12, LR, PC and xPSR). There was a recent discussion about this; the minimum interrupt latency on the Cortex-M4 is 12 cycles, and that does not include any additional registers your handler has to save. Because entry forces the core to suspend whatever it was executing, the time to service your interrupt varies with where the program was and what was going on. I always assume several hundred cycles can pass before my interrupt is serviced. And if you are running a SoftDevice, your application interrupt will always wait until the SoftDevice is done.

    The second problem with your approach is that you are using an example built on the GPIO drivers. The drivers are function calls nested within function calls; they make programming easy, but they do not make the processor's life easier. Instead, write directly to the GPIO registers: a single register write can accomplish what might take hundreds of cycles through the driver layer, as in the sketch below.
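    For example, a direct-register toggle might look like this (a minimal sketch; the pin number is hypothetical, and on the nRF52840 pins 32 and up live on NRF_P1 rather than NRF_P0):

    #include "nrf.h"

    #define LED3_PIN 15UL  /* hypothetical pin number; check your board schematic */

    void led_init(void)
    {
        NRF_P0->DIRSET = (1UL << LED3_PIN);  /* configure as output with one write */
    }

    /* Call from the timer ISR: toggles with single register writes,
     * no driver layers involved. */
    void led_toggle_fast(void)
    {
        if (NRF_P0->OUT & (1UL << LED3_PIN))
        {
            NRF_P0->OUTCLR = (1UL << LED3_PIN);  /* drive low */
        }
        else
        {
            NRF_P0->OUTSET = (1UL << LED3_PIN);  /* drive high */
        }
    }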

    The best and most reliable way to do microsecond or even nanosecond timing is PPI/GPIOTE. That gives you a full hardware implementation of the timer/GPIO activity: the software only plays a setup-and-advisory role, so you avoid the issues associated with software interrupts. A register-level sketch follows.
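    As a rough sketch of a hardware-only 500 µs toggle (no SoftDevice; TIMER0, PPI channel 0 and GPIOTE channel 0 are arbitrary choices, and the pin is hypothetical):

    #include "nrf.h"

    #define LED3_PIN 15UL  /* hypothetical pin; check your board schematic */

    void hw_toggle_init(void)
    {
        /* GPIOTE channel 0: drive LED3_PIN as a task pin, toggling on each TASKS_OUT. */
        NRF_GPIOTE->CONFIG[0] =
              (GPIOTE_CONFIG_MODE_Task       << GPIOTE_CONFIG_MODE_Pos)
            | (LED3_PIN                      << GPIOTE_CONFIG_PSEL_Pos)
            | (GPIOTE_CONFIG_POLARITY_Toggle << GPIOTE_CONFIG_POLARITY_Pos)
            | (GPIOTE_CONFIG_OUTINIT_Low     << GPIOTE_CONFIG_OUTINIT_Pos);

        /* TIMER0 at 16 MHz: compare at 8000 ticks = 500 us, auto-clear on match. */
        NRF_TIMER0->MODE      = TIMER_MODE_MODE_Timer;
        NRF_TIMER0->BITMODE   = TIMER_BITMODE_BITMODE_32Bit;
        NRF_TIMER0->PRESCALER = 0;       /* f_TIMER = 16 MHz */
        NRF_TIMER0->CC[0]     = 8000;
        NRF_TIMER0->SHORTS    = TIMER_SHORTS_COMPARE0_CLEAR_Msk;

        /* PPI channel 0: COMPARE[0] event triggers the GPIOTE toggle task.
         * The pin flips entirely in hardware; the CPU and its interrupt
         * jitter are not involved at all. */
        NRF_PPI->CH[0].EEP = (uint32_t)&NRF_TIMER0->EVENTS_COMPARE[0];
        NRF_PPI->CH[0].TEP = (uint32_t)&NRF_GPIOTE->TASKS_OUT[0];
        NRF_PPI->CHENSET   = PPI_CHENSET_CH0_Msk;

        NRF_TIMER0->TASKS_START = 1;
    }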

    Here is a link to a recent similar discussion: devzone.nordicsemi.com/.../

  • To clarify, the purpose of this test was not to generate signals. I wanted to measure the maximum time for which interrupts are held off by the SoftDevice. Since the SoftDevice runs at higher priority, we want to know whether our other tasks can still meet their real-time requirements.

    Where can I find these numbers?

  • Are there any critical sections in the SoftDevice code that could disable interrupts for a significant amount of time?

  • If the SoftDevice is running, it simply locks you out until it is done; your only interface to it is through its APIs. This is because the BLE protocol code is highly time-critical. I should point out that all the microsecond timing inside the SoftDevice is done with PPI/GPIOTE, precisely because of the unreliable nature of software interrupts.

    The SoftDevice Specification contains tables covering the interrupt latency you can expect. However, if you have highly time-dependent activity, you should consider letting the PPI/GPIOTE architecture handle some of it. Here is a link to the S132 spec for your reference: infocenter.nordicsemi.com/.../S132_SDS_v4.0.pdf

  • @KV: If you want a period during which you are guaranteed not to be interrupted by the SoftDevice, consider using the Timeslot API. Timeslots are explained in Chapter 9 of the S132 spec, and there is a tutorial for them in the Tutorials section at the top of DevZone. A minimal sketch follows.
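    For reference, here is a minimal sketch of opening a timeslot session and requesting a slot, assuming the S132 Timeslot API from nrf_soc.h (the slot length and timeout are arbitrary placeholders):

    #include "nrf_soc.h"
    #include "app_error.h"

    static nrf_radio_request_t                      m_request;
    static nrf_radio_signal_callback_return_param_t m_signal_ret;

    /* Runs while the timeslot is active; the SoftDevice will not
     * preempt the CPU during the granted slot. */
    static nrf_radio_signal_callback_return_param_t * radio_callback(uint8_t signal_type)
    {
        switch (signal_type)
        {
            case NRF_RADIO_CALLBACK_SIGNAL_TYPE_START:
                /* Do the time-critical work here, then give the slot back. */
                m_signal_ret.callback_action = NRF_RADIO_SIGNAL_CALLBACK_ACTION_END;
                break;
            default:
                m_signal_ret.callback_action = NRF_RADIO_SIGNAL_CALLBACK_ACTION_NONE;
                break;
        }
        return &m_signal_ret;
    }

    void timeslot_start(void)
    {
        APP_ERROR_CHECK(sd_radio_session_open(radio_callback));

        m_request.request_type               = NRF_RADIO_REQ_TYPE_EARLIEST;
        m_request.params.earliest.hfclk      = NRF_RADIO_HFCLK_CFG_NO_GUARANTEE;
        m_request.params.earliest.priority   = NRF_RADIO_PRIORITY_NORMAL;
        m_request.params.earliest.length_us  = 10000;   /* 10 ms slot (placeholder) */
        m_request.params.earliest.timeout_us = 100000;  /* wait up to 100 ms (placeholder) */
        APP_ERROR_CHECK(sd_radio_request(&m_request));
    }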
