
BLE Software configuration for FCC Spurious Emissions Test

I have a dual-mode BLE / ANT+ nRF51422 design going through regulatory testing for FCC / EU market approval. I have previously successfully tested a very similar ANT+ design.

One of the tests that I am required to do is the "Spurious Emissions Test", where we transmit at the lower, middle, and upper points of the allowed 2.4 GHz band while the facility measures peak and average power outside the allowable range. The test facility requires us to limit our transmissions to a particular set of channels: we must configure our system to use only 2.402 GHz, then repeat with only 2.442 GHz, then again with only 2.480 GHz.

For the ANT+ design, it was trivial to simply change the channel to meet this test configuration.

For our BLE design, I do not see a simple way to accomplish this frequency restriction. In the S310 API I see the ability to restrict the data-channel map via ble_gap_opt_ch_map_t, but I would also have to restrict advertising. And even if I configured my unit under test to run our current software load on a single BLE channel, I would need another BLE radio in the test chamber to generate any traffic beyond basic advertising, which would result in an inaccurate test.

To further complicate matters, the test facility states that it is acceptable / "standard practice" to simply test the unit in advertising mode on these restricted channels. I believe they are misinterpreting the standard: in advertising mode the packet rate is typically much lower than during an active connection, so the measured average power of any emissions outside the acceptable band will be artificially lower than the worst case of an active transmission.

We proposed a different approach: use the "radio test" example code to generate a worst-case signal, with the radio continuously transmitting a modulated signal at a particular data rate on a single frequency. The facility measures the peak and average power outside the acceptable band in this configuration. We then calculate how much lower our actual peak data rate is than this worst case and derive a dB offset from that figure, basing the calculation on our number of transmission buffers and the transmission interval in which those packet buffers are sent. We based this approach on FCC documentation and other vendors' documents.

So, finally, to the questions:

  1. Am I correct that testing only in advertising mode will result in a much lower measured average power than testing at an active connection's packet rate?

  2. If yes, is that a concern for FCC / EU spurious emissions testing, or is that simply how it is done?

  3. Is the approach of testing the worst case and calculating a dB margin based on packet duty cycle a valid one?

  4. Is there a better way to accomplish this type of testing on an nRF51422 / S310 v3 / SDK 9 based design?

Thanks for your time if you read this far, David
