I'm currently trying to scan non-connectable advertisements continuously.
So I'm going to set the scan window and scan interval to the same value.
For that purpose, I want to try out values of the scan window and interval to find the best
performance (the lowest packet loss).
What would be the right approach to find the best value?
Additionally, if I don't have to worry about the power consumption of the scanner (BLE central),
would it be good to set the scan window to the minimum value (which is 2.5 ms)?
Thanks in advance.
In general, if you set the scan window equal to the scan interval, the radio will be on about 100% of the time. There will be a short period of a few hundred µs (depending on the SoftDevice release; I don't have exact numbers) after each scan interval where it changes frequency to the next advertisement channel, and during this period it can't receive any advertisement packets. It rotates between the 3 advertisement channels after each scan interval.
Typically I recommend setting the scan window to the minimum and adjusting the scan interval to fit the current-consumption requirements of the application. The reason I recommend a short scan window is that it has minimal effect on other SoftDevice functionality you might want to use in the application. For instance, other connections are blocked during a scan window, and a long scan window can delay execution of flash commands and the Timeslot API.
Thank you for your kind answer.
But I have another issue regarding continuous scanning.
I tested the scanner with many combinations of scan window and interval values.
The best result I got from these tests was 10% advertisement packet loss.
That is much higher than I expected.
For the test, I set a beacon to advertise every 20 ms with an incremented major value.
Then the scanner scans continuously (I tested windows from the minimum up to 100 ms).
(I set the scan interval equal to the scan window because there is no other radio work to do.)
Do you think I tested it wrongly? That loss rate seems too high to me.
If there are any previous tests of this situation, can you tell me the resulting loss rate?
Hi, how do you ensure that you update the advertisement packet after every advertisement event?
Yes, I'm sure, because I implemented the beacon to start advertising, stop manually after 20 ms, and repeat while updating the major value. I also tested it using only 1 channel: I set the beacon to use only 1 channel and the scanner to scan only that channel. In that case the scanner missed very few packets (about 2 of 256), and each major value was received exactly once. But using 3 channels, the result was almost 30 packets lost out of 256 (I didn't count packets with a repeated major value as received). So I wonder if this is the expected result. Are there similar results from this kind of test?
Just note that when you set an advertisement interval of 20 ms, the actual spacing between advertising events is the configured interval plus a pseudo-random delay of 0-10 ms, i.e., anywhere between 20 and 30 ms. This is required by the Bluetooth spec to avoid several advertisers repeatedly colliding. So you should take this into consideration.
It might also be that you have interference in the system. Did you repeat your single-channel test individually on all 3 channels, or only on 1? There may be a lot of interference on one particular channel.