Hello, I am trying to implement a Time-On-Air measurement with two nRF24L01+ modules. I set up the devices to work at 2 Mbps, with a 3-byte address, 1-byte payload, and 1-byte CRC. Both run from a dedicated 16 MHz crystal. I am using a logic analyzer sampling at 25 MHz for my tests.
First, I attempted to measure the delay between CE (rising edge) and IRQ (falling edge, packet sent) on the transmitter. I measured a minimum delay of 156.24 µs and a maximum of 156.36 µs. Both values are smaller than what the timing diagram in the datasheet (section 7.7, "Enhanced ShockBurst Timing") predicts: the formula in Table 19 gives 164.5 µs, or 160 µs if I assume there is no 9-bit packet control field.
Are the diagram and the formula correct? Does Tstdby2a really start at the rising edge of CE, rather than after the 10 µs minimum CE pulse?
Are those times fixed, and if not, how much can they vary? I suppose the internal logic runs at 16 MHz, so any variation should be a multiple of the corresponding period (62.5 ns).
My ultimate goal, of course, is to measure the delay on the RX device. In the datasheet (page 43, figure 16), it looks like IRQ on the RX device is asserted Tirq after the end of the transmission. That means, with negligible distance between TX and RX, the IRQ on the TX device should occur at essentially the same time as the IRQ on the RX device. But that is not what I observe: there is an average delay of 5.5 µs between them. Again, what is the variability of these times on the RX side?
Is there another product with similar capabilities and "deterministic" timing?
Thanks a lot for your help.