I'm trying to understand the subtleties in a Gazell link setup with multiple devices and a single host.
The documentation in the Infocenter suggests minimum timeslot_period values for a given data rate. It also suggests that timeslots_per_channel be set to at least two on the host: "Since the Device and Host can not be in perfect synchronization, a transmission should [sic; presumably "could"] overlap to adjacent timeslots on the Host."
Question 1: Should timeslots_per_channel be set to 2 on both host and device, or just the host? To me it does not make sense for the value to differ between host and device, because that would cause the device to cycle through the channel table twice as fast as the host, and sync would not be achievable.
When using two devices with a single host, it is suggested that an even number of channels be used (e.g. 4, 16, 25, 42, 63, 77) and that the host use all channels in its channel table. Device 1 would use the even-indexed values (4, 25, 63) and Device 2 the odd-indexed values (16, 42, 77). The devices would set their timeslot_period to 2x the host's value, which keeps the overall cycle period the same.
Now my understanding, given that it is recommended that the host "dwell" on a single channel for two timeslots, is that the effective per-channel period will end up being 4x the minimum recommended value (2x because a transmission may "spill over" into an adjacent timeslot due to clock differences, and 2x again because each device is hopping over only half of the configured channels).
Question 2: Is my understanding correct? For example, with two devices and one host in a 2 Mbit/s system, where the minimum recommended timeslot_period is 600 us, should I set timeslots_per_channel to 2 (on both host and devices), timeslot_period to 1200 us on the host, and timeslot_period to 2400 us on the devices?