
Sub oscillator usage when using Gazell protocol

To the kind attention of the Nordic support team,

We are going to use the Gazell protocol for some of our wireless products. Could you please point me to the relevant documentation on how the Gazell protocol state machine gets its time base? Do you recommend using the XL1/XL2 sub oscillator for more precision? I should say that our device will not have power constraints, so would it suffice to use the main oscillator and apply some sort of software correction? In that case, is there any change that must be made to the code in order to inform the Gazell stack about the timing source?

  • Thank you very much for your reply. As your documentation states, gzll_arm.lib needs NRF_TIMER2 and gzll_sd_resources_arm.lib needs NRF_TIMER0, so I think this implies that the main oscillator should be fine as the timing source. In any case, we could get an idea of the overall transmission efficiency empirically, by letting two devices transmit and receive for a long time and monitoring how many failures occur, if any. That way we could also tune the Gazell radio parameters in a practical way. Is there anything else, or any laboratory method, that you could recommend for investigating the real effect of the Gazell radio parameters on the overall communication efficiency? Thanks for your kindness.
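
    For instance, we were thinking of simply counting the callbacks on the device side, roughly like this (just a sketch: the callback names come from the nRF5 SDK Gazell library, while the counters and the empty host-side stubs are our own placeholders):

        #include <stdint.h>
        #include "nrf_gzll.h"

        /* Simple empirical link-quality counters, incremented from the Gazell
         * device callbacks. Reading them out periodically (e.g. over UART)
         * gives the success ratio of a long-running test. */
        static volatile uint32_t m_tx_success_count = 0;
        static volatile uint32_t m_tx_failed_count  = 0;

        void nrf_gzll_device_tx_success(uint32_t pipe, nrf_gzll_device_tx_info_t tx_info)
        {
            m_tx_success_count++;
        }

        void nrf_gzll_device_tx_failed(uint32_t pipe, nrf_gzll_device_tx_info_t tx_info)
        {
            m_tx_failed_count++;
        }

        /* The remaining Gazell callbacks must still be defined when linking the
         * library, even if they are unused on a device-only node. */
        void nrf_gzll_host_rx_data_ready(uint32_t pipe, nrf_gzll_host_rx_info_t rx_info) {}
        void nrf_gzll_disabled(void) {}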

  • The Gazell protocol will handle most of this automatically, but here are a few suggestions:

    - If you have several devices, you should let only one of them use synchronization; otherwise the devices may start to transmit at exactly the same time, which will give very poor results.

    - If you require low latency and/or high throughput, I recommend using NRF_GZLL_DEVICE_CHANNEL_SELECTION_POLICY_USE_CURRENT, set by calling nrf_gzll_set_device_channel_selection_policy() (see the sketch after this list).

    - In synchronization mode, wait for nrf_gzll_device_tx_success()/nrf_gzll_device_tx_failed() before adding the next packet with nrf_gzll_add_packet_to_tx_fifo().
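
    On the device side that looks roughly like this (untested sketch based on the nRF5 SDK Gazell API; the pipe number, payload and the flush-on-failure handling are just placeholders you will want to adapt):

        #include <stdbool.h>
        #include "nrf_gzll.h"

        #define PIPE_NUMBER 0

        static uint8_t       m_payload[NRF_GZLL_CONST_MAX_PAYLOAD_LENGTH]; /* Payload contents omitted. */
        static volatile bool m_tx_in_flight = false;

        static void gzll_device_setup(void)
        {
            nrf_gzll_init(NRF_GZLL_MODE_DEVICE);

            /* Suggestion 2: keep transmitting on the current channel for
             * lower latency / higher throughput. */
            nrf_gzll_set_device_channel_selection_policy(
                NRF_GZLL_DEVICE_CHANNEL_SELECTION_POLICY_USE_CURRENT);

            nrf_gzll_enable();
        }

        static void send_next_packet(void)
        {
            /* Suggestion 3: only queue a new packet once the previous one
             * has been resolved in the TX callbacks. */
            if (!m_tx_in_flight)
            {
                m_tx_in_flight = nrf_gzll_add_packet_to_tx_fifo(PIPE_NUMBER,
                                                                m_payload,
                                                                sizeof(m_payload));
            }
        }

        void nrf_gzll_device_tx_success(uint32_t pipe, nrf_gzll_device_tx_info_t tx_info)
        {
            m_tx_in_flight = false;   /* Delivered; safe to queue the next packet. */
        }

        void nrf_gzll_device_tx_failed(uint32_t pipe, nrf_gzll_device_tx_info_t tx_info)
        {
            /* Drop the failed packet so it does not stay at the head of the FIFO,
             * then allow the application to queue again. */
            nrf_gzll_flush_tx_fifo(pipe);
            m_tx_in_flight = false;
        }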

    Best regards,
    Kenneth

  • Thank you very much for your advice, it is really helpful. The second and third points are clear, and we understand how to put them into practice. Could you please explain the first one a little further? We are going to use multiple devices and one Gazell host, so it would be nice to understand your point better. It is clear that we want to avoid congestion as much as possible. Could you give a small practical example of which Gazell parameter is key in this respect? Thanks in advance.

  • The time a device will keep track of the synchronization with the host is configured through nrf_gzll_set_sync_lifetime(). If several devices are transmitting, keep their synchronization, and call nrf_gzll_add_packet_to_tx_fifo() at close to the same time, they will try to transmit at the same time based on the synchronization with the host timing. This will very likely cause an on-air collision, where neither device is able to successfully transmit its data, and the devices will keep trying at the same time because of the synchronization that is in place. Only once one or both finally lose synchronization will they be able to transmit successfully again. That is why I suggest that, in general, all devices should use nrf_gzll_set_sync_lifetime() with a value of 0, which means the link is always treated as out of sync. You may, however, choose one device that uses nrf_gzll_set_sync_lifetime() with a value different from 0.
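
    For example (sketch based on the nRF5 SDK Gazell API; the lifetime of 1000 timeslot periods is just an illustrative value):

        #include <stdbool.h>
        #include "nrf_gzll.h"

        /* Call during initialization, before the device starts transmitting.
         * Exactly one device in the system should be given keep_sync = true. */
        static void gzll_sync_config(bool keep_sync)
        {
            nrf_gzll_init(NRF_GZLL_MODE_DEVICE);

            if (keep_sync)
            {
                /* The one device allowed to stay synchronized: it keeps the
                 * host timing for 1000 timeslots after its last successful
                 * transmission. */
                nrf_gzll_set_sync_lifetime(1000);
            }
            else
            {
                /* All other devices: treat the link as always out of sync, so
                 * their transmissions are not aligned to the host timeslots
                 * and they do not repeatedly collide with each other. */
                nrf_gzll_set_sync_lifetime(0);
            }

            nrf_gzll_enable();
        }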
