A typical OPM measures accurately under most conditions from about 0 dBm (1 milliwatt) down to about -50 dBm (10 nanowatts), although the display range may be larger. Levels above 0 dBm are considered "high power", and specially adapted units may measure up to nearly +30 dBm (1 watt).
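For illustration, the dBm scale is a logarithmic power scale referenced to 1 mW, so absolute power follows from P(mW) = 10^(dBm/10). The short sketch below (assumed helper names, not tied to any particular instrument or library) reproduces the figures quoted above:

    import math

    def dbm_to_watts(dbm):
        """Convert a power level in dBm to watts (0 dBm = 1 mW)."""
        return 1e-3 * 10 ** (dbm / 10)

    def watts_to_dbm(watts):
        """Convert a power in watts to dBm."""
        return 10 * math.log10(watts / 1e-3)

    print(dbm_to_watts(0))    # 1e-3 W   (1 milliwatt)
    print(dbm_to_watts(-50))  # 1e-8 W   (10 nanowatts)
    print(dbm_to_watts(30))   # 1.0 W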
Irrespective of power meter specifications, measurements below about -50 dBm tend to be sensitive to stray ambient light leaking into fibers or connectors. When testing at such "low power" levels, some form of test range and linearity verification (easily done with attenuators) is therefore advisable. Optical signal measurements at low power levels also tend to be noisy, so meters may respond very slowly because they apply a significant amount of signal averaging.
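One simple way to verify range and linearity is to step a calibrated attenuator and check that the meter reading falls by the same number of dB as the nominal attenuation at each step; a growing discrepancy at the lowest levels often points to stray light or the noise floor. The sketch below is only illustrative, and the read_power_dbm and set_attenuation_db callables are hypothetical placeholders for whatever drivers the actual meter and attenuator provide:

    def check_linearity(read_power_dbm, set_attenuation_db,
                        steps_db=(0, 10, 20, 30, 40, 50), tolerance_db=0.2):
        """Step a calibrated attenuator and compare the measured power drop
        against the nominal attenuation at each step."""
        set_attenuation_db(steps_db[0])
        reference = read_power_dbm()
        results = []
        for step in steps_db[1:]:
            set_attenuation_db(step)
            measured_drop = reference - read_power_dbm()
            expected_drop = step - steps_db[0]
            within_spec = abs(measured_drop - expected_drop) <= tolerance_db
            results.append((step, measured_drop, within_spec))
        return results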
The calibration and accuracy of optical power meters is a contentious issue. Most primary reference standards (e.g. mass, time, length, the volt) are known to very high accuracy, typically of the order of one part in a billion. However, the optical power standards maintained by NIST are only defined to about one part in a thousand. By the time this accuracy has been further degraded through the successive links of the calibration chain, instrument calibration accuracy is usually only a few percent. The most accurate field optical power meters claim 1% calibration accuracy. Comparatively, this is orders of magnitude less accurate than a typical electrical voltmeter.
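To see roughly how a 0.1% primary standard turns into a few percent at the instrument, the relative uncertainty of each link in the calibration chain can be combined in quadrature. The figures below are assumed, purely illustrative values, not published NIST or manufacturer data:

    import math

    # Assumed relative uncertainty of each calibration link, as a fraction.
    chain = {
        "primary standard": 0.001,        # ~1 part in 1000
        "transfer standard": 0.005,
        "working standard": 0.01,
        "instrument calibration": 0.02,
    }

    # Root-sum-square combination of independent relative uncertainties.
    total = math.sqrt(sum(u ** 2 for u in chain.values()))
    print(f"combined calibration uncertainty ~ {total * 100:.1f}%")  # ~2.3%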
Further, the accuracy achieved in use is usually significantly lower than the claimed calibration accuracy once additional factors are taken into account. In typical field applications, these factors may include ambient temperature, optical connector type, wavelength variations, linearity variations, beam geometry variations, and detector saturation.
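As a worked example with assumed, representative contributions: a meter with ±1% calibration accuracy subject to ±2% connector and beam geometry variation, ±1% wavelength mismatch, ±0.5% temperature drift and ±0.5% nonlinearity has an in-use uncertainty of roughly sqrt(1² + 2² + 1² + 0.5² + 0.5²) ≈ 2.5%, well above the headline calibration figure.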