
Limits of a software quadrature decoder on the nRF52840 with the SoftDevice.

Hello list,

Starting from the ble_app_uart example, I am trying to create two DC motor control loops. Instead of using the UART, I use the scheduler to create a simple command loop over BLE to control the app.
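
Roughly, the command path works like this (a sketch; command_handler and on_ble_data are placeholder names, error checking omitted): the BLE RX handler copies the received bytes into the app_scheduler queue, and the command is executed later from the main loop via app_sched_execute().

#include "app_scheduler.h"

// Sketch of the command path. APP_SCHED_INIT(MAX_CMD_LEN, QUEUE_SIZE) is
// called once during initialization, and the main loop calls
// app_sched_execute() before going to sleep.

static void command_handler(void * p_event_data, uint16_t event_size)
{
  // Parse and execute the received command in main context (placeholder).
}

// Called from the BLE event context when data arrives.
static void on_ble_data(uint8_t const * p_data, uint16_t length)
{
  // app_sched_event_put() copies the data into its queue, so the BLE
  // buffer can be handed over safely here.
  app_sched_event_put(p_data, length, command_handler);
}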

There is only one QDEC peripheral, so I need a software decoder, but that only works when the motor is turning slowly; at higher speeds it seems that interrupts are being missed!

I use one of the encoder inputs to create an interrupt on each transition using GPIOTE. The handler is as follows:

void encoder1Event(nrf_drv_gpiote_pin_t pin, nrf_gpiote_polarity_t action)
{
  // The interrupt fires on both edges of channel A (enca1);
  // channel B (enca2) is sampled to determine the direction.
  if (nrf_drv_gpiote_in_is_set(enca1)) {
    // Rising edge on A: B low means one direction, B high the other.
    if (!nrf_gpio_pin_read(enca2)) {
      encoder1--;
      nrf_gpio_pin_set(LED_DL3_BLU);    // LED for debugging, see below
    } else {
      nrf_gpio_pin_clear(LED_DL3_BLU);
      encoder1++;
    }
  } else {
    // Falling edge on A: the direction mapping is inverted.
    if (!nrf_gpio_pin_read(enca2)) {
      encoder1++;
    } else {
      encoder1--;
    }
  }
}

Note the LED added for debugging. If the motor is running in one direction, the LED should stay in the same state, say ON.

But as soon as the interrupts come as fast as one per millisecond, it seems that interrupts are being missed (a lot!), as shown by the LED. The command loop shows the same thing: encoder1 does not increase steadily as it should in this case. A scope reveals that the encoder signals are as they should be.
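
For completeness, the inputs are configured roughly like this (a sketch; error checking omitted, enca1/enca2 are my pin defines):

#include "nrf_drv_gpiote.h"

static void encoder1_init(void)
{
  if (!nrf_drv_gpiote_is_init()) {
    nrf_drv_gpiote_init();
  }

  // Interrupt on both edges of channel A; hi_accuracy = true uses a
  // dedicated GPIOTE channel (IN event) instead of the PORT event.
  nrf_drv_gpiote_in_config_t config = GPIOTE_CONFIG_IN_SENSE_TOGGLE(true);
  nrf_drv_gpiote_in_init(enca1, &config, encoder1Event);
  nrf_drv_gpiote_in_event_enable(enca1, true);

  // Channel B is only sampled in the handler, so a plain input suffices.
  nrf_gpio_cfg_input(enca2, NRF_GPIO_PIN_NOPULL);
}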

I think that the only thing that could do this is the SoftDevice, correct?

Does this mean that interrupts are blocked for 1 ms on a regular basis by the SoftDevice?

Thanks in advance, Sietse

  • It is a bit more complicated. When toggling a GPIO in the handler and watching it on a scope, I see that all interrupts are coming through!
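
    The handler used for this measurement is essentially just a pin toggle (DEBUG_PIN is a placeholder for the output I watch on the scope):

    void encoder1Event(nrf_drv_gpiote_pin_t pin, nrf_gpiote_polarity_t action)
    {
      nrf_gpio_pin_toggle(DEBUG_PIN);  // one toggle per encoder edge
    }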

    There is a constant delay of 20 microseconds between the encoder edge and the toggling of the GPIO, and it occasionally jumps to 150 microseconds. I find the constant 20 microseconds very long, and it is strange that it is so constant, but for my use case it is OK. The minimum distance between encoder interrupts is 200 microseconds, so it could work.

    I will first investigate further.

  • Further investigation with a scope and a logic analyzer reveals that the encoder signals are not clean. When running faster, they probably cause the problems. So this is probably a problem in my hardware setup, sorry.

    But I would very much like to understand the delays I mentioned. What is happening during those 20 microseconds? I want to learn whether it is feasible to have very short interrupts every, say, 200 or more microseconds.

    And why is it so constant? That suggests it has nothing to do with a busy SoftDevice. Note that the testing is done while connected over Bluetooth but with no communication going on, so I would imagine the SoftDevice is more or less at "rest".

    I tried the tests with GPIOTE interrupt priority levels 6 and 2. Thanks again.
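
    In case the signals remain somewhat noisy, I am also considering a table-driven decoder that samples both channels on every edge (so toggle interrupts on both enca1 and enca2) and simply ignores illegal transitions. An untested sketch:

    static volatile int32_t encoder1;
    static uint8_t enc1_prev;  // previous 2-bit state: (A << 1) | B

    // Indexed by (previous_state << 2) | new_state. Valid single-step
    // transitions count -1/+1; illegal jumps (glitches) count 0.
    static const int8_t qdec_lut[16] = {
       0, +1, -1,  0,
      -1,  0,  0, +1,
      +1,  0,  0, -1,
       0, -1, +1,  0
    };

    void encoder1Event(nrf_drv_gpiote_pin_t pin, nrf_gpiote_polarity_t action)
    {
      uint8_t state = (nrf_gpio_pin_read(enca1) << 1) | nrf_gpio_pin_read(enca2);
      encoder1 += qdec_lut[(enc1_prev << 2) | state];
      enc1_prev = state;
    }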

  • Hi,

    I am not entirely sure I understand the question or context. How have you configured the QDEC? The sample period is important here, and the shortest period you can configure is 128 µs. There is no way to get higher time resolution than that.

    Though that may not be relevant here: you can find the durations for which the SoftDevice blocks the application in the SoftDevice Specification, under "Processor usage patterns and availability".

  • Thanks for the reply.

    I am not using the QDEC; for now I am only testing the software encoder using two GPIOTE interrupts.

    From "Interrupt latency due to SoC framework" I read that up to 4 microseconds of latency is to be expected. Why am I seeing 20 microseconds? I would have thought that a GPIOTE interrupt only has to be redirected to the app.

    What am I doing wrong here? Nothing is happening: the SoftDevice is "idling" in the connected state, and there can be two interrupts from the encoder signal. Nothing more.

  • Hi,

    I see. So you are measuring (with a logic analyzer or similar) the time from a change on a GPIO input to a change on a GPIO output? If so, 20 µs is a lot, I agree. If the device is in System ON low power mode (after a call to sd_app_evt_wait() or __WFE()), the HFINT clock needs to be started, which adds about 4 µs. There are also some microseconds for CPU startup etc., and some CPU cycles for your logic. But it should not add up to anything close to 20 µs.
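
    If the HFINT startup turns out to matter for your latency, one thing you could try (at the cost of sleep current) is to keep the HF crystal running via the SoC API while your control loops are active, for example:

    #include "nrf_soc.h"

    // Keep the HF crystal running so the clock is already started when
    // the GPIOTE interrupt fires (costs sleep current while requested).
    uint32_t err_code = sd_clock_hfclk_request();
    APP_ERROR_CHECK(err_code);

    // Later, when low latency is no longer needed:
    err_code = sd_clock_hfclk_release();
    APP_ERROR_CHECK(err_code);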

    Can you share a project that reproduces this so that I can test on my side? Please note that I will be off for the Christmas holiday until the beginning of January, so my next reply will probably be delayed until then.
