Greetings. This is a follow-up to my previous question, BLE Software Configuration for FCC Spurious Emissions. I did not ask the right question there, and I did not want to dilute the value of the answer provided by editing that question. I did read the testing white paper linked there, but it did not answer my questions.
I am on the fourth day of FCC / EU testing now and have run into some challenges. I would like the experts at Nordic to provide their opinion on how we should prepare for future tests.
The test facility expects us to provide a version of our software that is typical of our use case and transmits at a worst-case packet rate on a particular set of channels (2.402, 2.442, and 2.480 GHz). This packet rate must be maintained without a second client radio in the chamber to exchange packets with, since they only want to measure the effect of our radio under these conditions. BLE is a connection-oriented protocol, so as far as I am aware this is not an official capability of BLE. The only unsolicited packets are advertising packets, which go out on a fixed subset of channels (37, 38, and 39), and their maximum configurable rate per the standard is much lower than that of normal connection traffic. The test facility engineers claim that all of their previous BLE clients have been able to provide a modified version of their software that supports these "unsolicited BLE-like packets on a fixed frequency at a worst-case rate".
The test facility then wants to make average-power measurements in that configuration to ensure that the average power at the harmonics, and just outside the band edges, is within requirements.
I only have experience with the Nordic stack / chipset, but I can find no support for such a mode in the BLE API.
- Has anyone at Nordic heard of such a test methodology? In my research, I have not been able to find any references to doing testing in this manner.
Our proposed methodology was to use the "radio test" code to generate a modulated signal at worst-case transmission power: set the "m" option in radio test to 1 Mbit/s, set the power level to the level used in our software, set the channel to the required frequency, and send a modulated continuous signal at those settings. That provides a "worst-case, 100% duty cycle" radio power signature. We would then provide our calculated duty cycle, based on the transmission interval and the number of available TX buffers, and from that duty cycle compute the average power of our use case as a dB offset from the 100% duty-cycle measurement. We found numerous references in FCC material and on competitors' websites (Texas Instruments, for example) that appear to match this approach exactly.
In the experience of Nordic engineers, is that a typical / reasonable / supportable approach to measuring average power output at the harmonics of a particular application that uses the Nordic BLE stack?
Is there a simpler / better way to make a measurement that meets the spirit and intent of the FCC requirements?
The test facility was not comfortable taking measurements that "failed" at 100% duty cycle and then applying a mathematical correction factor derived from an analysis of the worst-case radio duty cycle.
Since both the FCC and the European standards regulate average power, and average power is a function of duty cycle, how do customers typically test to those standards? Do they run "radio test" at 100% duty cycle and usually "just pass"? Do they apply an engineering analysis of the code to justify that their duty cycle would pass with the given data? Or do they write custom code that PWMs the radio?
Is there some software / SDK resource that I have missed that allows direct measurement of the peak and average power of the harmonics while taking packet rate into account?
TL;DR: How do you directly measure the worst-case average power output of a BLE application in a test chamber without a second radio in the chamber?
Thanks for your time, David