The calibration and accuracy of optical power meters is a contentious issue. Most primary reference standards (e.g. mass, time, length, voltage) are defined to very high accuracy, typically on the order of one part in a billion. However, the optical power standards maintained by NIST are only defined to about one part in a thousand. By the time this accuracy has been further degraded through the successive links of the calibration chain, instrument calibration accuracy is usually only a few percent. The most accurate field optical power meters claim 1% calibration accuracy, which is still orders of magnitude worse than a typical electrical voltmeter.
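As a rough illustration of how accuracy degrades along the chain, independent uncertainty contributions are commonly combined in quadrature (root-sum-of-squares). The individual step values below are assumptions chosen for the sketch, not figures from the source:

```python
import math

# Hypothetical traceability chain, for illustration only: each transfer
# step contributes its own uncertainty (in %), and independent
# uncertainties are combined in quadrature (root-sum-of-squares).
chain_pct = {
    "NIST primary standard":  0.1,  # ~1 part in 1000, per the text
    "transfer standard":      0.3,  # assumed transfer uncertainty
    "working standard":       0.5,  # assumed
    "instrument calibration": 0.8,  # assumed
}

combined = math.sqrt(sum(u**2 for u in chain_pct.values()))
print(f"combined calibration uncertainty: {combined:.2f} %")
# -> roughly 1 %, consistent with the best field meters' claims
```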
Further, the accuracy achieved in use is usually significantly worse than the claimed calibration accuracy once additional factors are taken into account. In typical field applications these factors may include ambient temperature, optical connector type, wavelength variation, linearity variation, beam geometry variation, and detector saturation.
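A sketch of an in-use uncertainty budget makes the gap concrete. The individual contributions below are assumed values for illustration, not published specifications; only the 1% calibration figure comes from the text:

```python
import math

# Illustrative in-use uncertainty budget (all values in %).
budget_pct = {
    "calibration":         1.0,  # the meter's claimed calibration accuracy
    "ambient temperature": 1.0,  # assumed
    "connector type":      2.0,  # assumed
    "wavelength offset":   1.0,  # assumed
    "linearity":           0.5,  # assumed
    "beam geometry":       1.0,  # assumed
}

# Independent contributions combined in quadrature (root-sum-of-squares).
in_use = math.sqrt(sum(u**2 for u in budget_pct.values()))
print(f"estimated in-use uncertainty: {in_use:.1f} %")
# -> about 2.9 %, i.e. several times the claimed calibration accuracy
```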
Therefore, achieving a good level of practical instrument accuracy and linearity requires considerable design skill and care in manufacturing.