In the GPS sample there is a struct, nrf_modem_gnss_pvt_data_frame, which contains an accuracy field in meters. The documentation defines it as: /** Accuracy (2D 1-sigma) in meters. */
What formula is used to calculate this accuracy value? I'm confused by the mention of 1-sigma (i.e. the standard deviation): usually sigma is computed from a number of samples around their average. In GPS, however, this value is reported even when only a single fix has been received. If it were computed by averaging samples, the initial sigma would always be zero, and that is clearly not the case.