Variance Factor and Input Scale Factor

The variance factor appears at the bottom of the network adjustment report. It is the ratio between the observed residual errors and the estimated session (baseline) accuracies. Ideally, the variance factor should be 1.0, indicating that the estimated errors correspond well to the observed errors. A variance factor less than 1.0 indicates that the estimated errors are larger than the observed errors (that is, the session standard deviations are pessimistic). A value greater than 1.0 denotes that the observed errors are larger than the estimated accuracies (that is, the session standard deviations are optimistic); this is often the case unless the GPS data is very clean. Thus, low variance factors are normally desired. Very large variance factors (100+) normally indicate abnormally large session errors (that is, a very poor network fit), and you should investigate the source of the problem before using the coordinates produced.
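The report computes this value for you, but the following minimal sketch illustrates the underlying ratio. It assumes uncorrelated observations (a diagonal weight matrix built from the session standard deviations); the function name, the sample residuals, and the sigma values are hypothetical, chosen only for illustration.

    import numpy as np

    def variance_factor(residuals, sigmas, num_observations, num_unknowns):
        # A posteriori variance of unit weight: the sum of squared
        # standardized residuals divided by the degrees of freedom.
        # Assumes uncorrelated observations (diagonal weight matrix).
        dof = num_observations - num_unknowns
        standardized = np.asarray(residuals) / np.asarray(sigmas)
        return float(np.sum(standardized ** 2) / dof)

    # Hypothetical session residuals (m) and estimated std devs (m):
    residuals = [0.012, -0.008, 0.015, 0.003]
    sigmas = [0.010, 0.010, 0.012, 0.009]
    vf = variance_factor(residuals, sigmas, num_observations=4, num_unknowns=1)
    # vf is about 1.25 here: the observed errors are slightly larger than
    # the estimated accuracies, so the standard deviations are optimistic.
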

The variance factor can also be used to scale the station standard deviations to more realistic values. The network adjustment is initially run with a unity scale factor. The resulting variance factor can then be entered in the scale factor field on the first screen. After rerunning the network adjustment with this new scale factor, you will notice larger or smaller standard deviations, and the new variance factor should be ~1.0. This procedure only works for a minimally constrained adjustment (that is, one 3-D control point, or one 2-D and one 1-D control point).
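The rescaling itself can be reproduced by hand, as in the sketch below. It assumes the standard least-squares convention in which the variance factor is a ratio of variances, so standard deviations scale with its square root; the sample values are hypothetical.

    import numpy as np

    # Hypothetical station standard deviations (m) from the initial
    # unity-scale adjustment run:
    std_devs = np.array([0.008, 0.011, 0.009])

    # Variance factor reported at the bottom of the adjustment report:
    vf = 1.25

    # The variance factor is a ratio of variances, so the standard
    # deviations scale by its square root.
    scaled_std_devs = np.sqrt(vf) * std_devs

    # Rerunning the adjustment with the new scale factor applied should
    # yield a variance factor of ~1.0.
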