
FCC testing for BLE and software configuration

Greetings. This is a follow-up to my previous question, BLE Software Configuration for FCC Spurious Emissions. I did not ask the right question there, and I did not want to dilute the value of the answer provided by editing it. I did read the testing white paper linked there, but it did not answer my questions.

I am on the fourth day of FCC / EU testing now and have run into some challenges. I would like the experts at Nordic to offer their opinion on how we should prepare for future tests.

The particular test facility expects us to provide a version of our software that is typical of our use case and has a worst-case packet rate on a particular set of channels (2.402, 2.442, and 2.480 GHz). This packet rate must be maintained without a second client radio to exchange packets with, since they only want to measure the effect of our radio under these conditions. BLE is a connection-oriented protocol, so this is not an official BLE capability that I am aware of. The only unsolicited packets are advertising packets, which go out on a subset of channels (37, 38, and 39), and whose maximum configurable rate per the standard is much lower than that of connection traffic. The test facility engineers claim that all of their BLE testing clients have been able to provide a modified version of their software that supports this "unsolicited BLE-like packets on a fixed frequency at a worst-case rate".

The test facility then wants to make average-power measurements in that configuration to ensure that the average power of the harmonics, and just outside the band edges, is within limits.

I only have experience with the Nordic stack / chipset, but I find no support for such a mode in the BLE API.

  1. Has anyone at Nordic heard of such a test methodology? In my research, I have not been able to find any references to doing testing in this manner.

Our proposed methodology was to use the "radio test" example code to generate a modulated signal at worst-case transmission power. That is, set the "m" option in radio test to 1 Mb/s, set the power level to the level used in our software, set the channel to the required frequency, and transmit a continuous modulated signal at those settings. That would provide a "worst case, 100 % duty cycle" radio power signature. We would then supply our calculated duty cycle, based on the transmission interval and the number of available TX buffers, and from that duty cycle derive the average power of our use case as a dB offset from the 100 % duty-cycle measurement. We found numerous references in FCC documents and on competitor websites such as Texas Instruments' that seem to match this approach exactly.
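For concreteness, the duty-cycle arithmetic described above can be sketched as follows. The packet airtime, interval, and buffer count are made-up illustrative numbers, not values from our firmware; the 20·log10 form of the correction reflects FCC practice for radiated field-strength limits, where the allowed relief is capped at 20 dB.

```python
import math

def worst_case_duty_cycle(packet_airtime_us, packets_per_interval, interval_us):
    """Fraction of the interval spent transmitting, with every TX buffer used."""
    return (packet_airtime_us * packets_per_interval) / interval_us

def fcc_correction_db(duty_cycle):
    """FCC field-strength duty-cycle correction: 20*log10(duty), capped at -20 dB."""
    return max(20 * math.log10(duty_cycle), -20.0)

# Hypothetical numbers: six 300 us packets per 7.5 ms connection interval
dc = worst_case_duty_cycle(300, 6, 7500)   # 0.24
offset_db = fcc_correction_db(dc)          # about -12.4 dB below the 100 % measurement
```

The offset is then added to the field strength measured at 100 % duty cycle to estimate the average field strength of the real application.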

  2. In the experience of Nordic engineers, is that a typical / reasonable / supportable approach to measuring average power output at the harmonics of a particular application that uses the Nordic BLE stack?

  3. Is there a simpler / better way to make a measurement that meets the spirit / intent of FCC?

The test facility was not comfortable with taking measurements that "failed" at 100 % duty cycle and then applying a mathematical correction factor derived from an analysis of the worst-case radio duty cycle.

  4. Since both the FCC and the European standards regulate average power, and average power is a function of duty cycle, how do customers typically test to those standards? Do they use "radio test" at 100 % duty cycle and usually "just pass", do they apply an engineering analysis of the code to justify that their duty cycle would pass with the given data, or do they write custom code that PWMs the radio?

  5. Is there some software / SDK resource that I missed that allows direct measurement of the peak and average power of harmonics, taking packet rate into account?

TL;DR: How do you directly measure the worst-case average power output of a BLE application in a test chamber without a second radio in the chamber?

Thanks for your time, David

  • Hi David

    Basically, you can't put the radio into the mode they want. The next best thing is to use directed advertising on only one channel. This sends an advertising packet every 3.75 ms, which gives the highest possible duty cycle. If you measure the duty cycle, you'll find it to be around 5 %.

    For harmonic measurements: all test labs that I'm aware of will apply a correction factor to the peak measurements if needed. For FCC testing, you can compensate by up to 20 dB depending on the duty cycle. The formula is 20*log10(duty cycle), so a duty cycle of 10 % or less gives you the full 20 dB. This means that if you pass the peak requirement (74 dBµV/m), you will always pass the average requirement (54 dBµV/m) as well, since the duty cycle of BLE on any one channel never exceeds 10 %.
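As a sanity check on the numbers above, a short sketch; the ADV_DIRECT_IND on-air size used here is an assumption for the 1 Mbps uncoded PHY, not a measured value.

```python
import math

# Assumed ADV_DIRECT_IND on-air size at 1 Mbps: preamble (1) + access address (4)
# + header (2) + payload (12: two 6-byte addresses) + CRC (3) = 22 bytes
airtime_us = 22 * 8           # 1 us per bit at 1 Mbps -> 176 us
interval_us = 3750            # high duty cycle directed advertising interval
duty = airtime_us / interval_us                     # ~0.047, i.e. about 5 %
correction_db = max(20 * math.log10(duty), -20.0)   # relief capped at 20 dB

# A device just at the 74 dBuV/m peak limit still meets the 54 dBuV/m average limit
assert 74 + correction_db <= 54
```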

    In Europe (ETSI), the harmonics are measured with a peak detector, so the duty cycle doesn't matter.

    Band-edge measurements under FCC rules are made with an RMS detector, and duty-cycle compensation cannot be applied there.

    Note that BLE is not considered a frequency-hopping device. It would fail the hopping requirements in advertising mode, as it uses only three channels.

    For FCC/ETSI testing, most customers use the "radio test" example from the SDK and set up the radio to transmit modulated packets (sent back to back) on three channels (2402, 2440, and 2480 MHz), plus RX on the same channels. The duty cycle is then compensated for mathematically, either using a known maximum duty cycle or by measuring the duty cycle on a device running the real firmware.
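Measuring the duty cycle on the real firmware can be as simple as integrating the radio's on-time over a capture window. A sketch with made-up timestamps (e.g. captured with a scope on a debug GPIO toggled while the radio is active — the burst times below are hypothetical):

```python
# Hypothetical (start, end) TX burst timestamps in microseconds, captured over
# a 10 ms observation window
tx_bursts = [(0, 180), (3750, 3930), (7500, 7680)]
window_us = 10_000

on_time_us = sum(end - start for start, end in tx_bursts)
duty_cycle = on_time_us / window_us   # 540 / 10000 = 0.054
```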

    When Bluetooth testing is required, DTM is the preferred way.

  • Thanks, Ketil. You have confirmed my suspicions, and this is a good answer for people looking to do FCC testing.
