We have a custom PCB with an nRF9151 supporting both LTE-M/NB-IoT and GNSS. The PCB has no switching regulators or any other likely noise source apart from the serial interfaces driven by the nRF9151. In Idle Mode, the RSE measurement was typically at the noise floor (>20dB margin), with only one frequency (~70MHz) passing with just 10dB of margin, and the CAT-M TIS measurement was -104dBm. So we've found no evidence of any noise concerns in the design.
The GNSS implementation consists of ~1.5cm (total) of 50-ohm trace routing to an LNA/SAW filter (SKY65943-11), then through some tuning options (currently bypassed), and then to a U.FL connector.
Testing was done with a Spectracom (now Safran) GSG-5 connected to the GNSS U.FL connector using ~10cm of coax. Cable loss was assumed to be 1dB and the GSG-5 output was compensated accordingly.
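For clarity, this is the compensation arithmetic we applied (a minimal sketch; the 1dB figure is our assumed coax loss, not a measured value):

```python
CABLE_LOSS_DB = 1.0  # assumed loss of the ~10cm coax between GSG-5 and U.FL

def gsg5_output_for(target_dbm: float) -> float:
    """GSG-5 output setting needed so target_dbm arrives at the U.FL connector."""
    return target_dbm + CABLE_LOSS_DB

# e.g. to present -130dBm at the connector, set the generator to -129dBm
print(gsg5_output_for(-130.0))  # -129.0
```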
Measured tracking sensitivity looked reasonable at -155dBm, which is within 1.5dB of the spec. Based on customer requirements, we only tested Cold-Start and Hot-Start sensitivity (and TTFF) at -130dBm and -136dBm respectively, each a 16.5dB relaxation from the spec's test levels.
At the Cold-Start sensitivity level (-130dBm) the TTFF was 144 seconds. This seems very poor compared to the Cold-Start TTFF of 30.5 seconds at -146.5dBm indicated in the spec.
At the Hot-Start sensitivity level (-136dBm) the TTFF was 13 seconds. This seems very poor compared to the Hot-Start TTFF of 1.3 seconds at -152.5dBm indicated in the spec.
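To make the comparison explicit, here is the arithmetic behind the "very poor" judgement (all numbers taken directly from the measurements and spec figures quoted above):

```python
# (test_level_dbm, measured_ttff_s, spec_level_dbm, spec_ttff_s)
cases = {
    "Cold-Start": (-130.0, 144.0, -146.5, 30.5),
    "Hot-Start":  (-136.0, 13.0, -152.5, 1.3),
}

for name, (lvl, ttff, spec_lvl, spec_ttff) in cases.items():
    relaxation_db = lvl - spec_lvl  # how much stronger our test signal is
    ratio = ttff / spec_ttff        # how much slower we are despite that
    print(f"{name}: {relaxation_db:+.1f}dB easier signal, "
          f"yet {ratio:.1f}x slower TTFF than spec")
```

So even with a signal 16.5dB stronger than the spec's test condition, Cold-Start TTFF is ~4.7x the spec figure and Hot-Start TTFF is 10x.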
Am I reading the TTFF/sensitivity specs incorrectly, or are there any suggestions on what to investigate to improve these TTFF numbers?