A multimeter or a multitester, also known as a VOM (volt-ohm-milliammeter), is an electronic measuring instrument that combines several measurement functions in one unit. A typical multimeter can measure voltage, current, and resistance. Analog multimeters use a microammeter with a moving pointer to display readings. Digital multimeters (DMM, DVOM) have a numeric display, and may also show a graphical bar representing the measured value. Digital multimeters are now far more common due to their lower cost and greater precision, but analog multimeters are still preferable in some cases, for example when monitoring a rapidly varying value.
A multimeter can be a hand-held device useful for basic fault finding and field service work, or a bench instrument which can measure to a very high degree of accuracy. They can be used to troubleshoot electrical problems in a wide array of industrial and household devices such as electronic equipment, motor controls, domestic appliances, power supplies, and wiring systems.
Multimeters are available in a wide range of features and prices. Cheap multimeters can cost less than US$10, while laboratory-grade models with certified calibration can cost more than US$5,000.
The resolution of a multimeter is often specified in the number of decimal digits resolved and displayed. If the most significant digit cannot take all values from 0 to 9 it is generally, and confusingly, termed a fractional digit. For example, a multimeter which can read up to 19999 (plus an embedded decimal point) is said to read 4½ digits.
By convention, if the most significant digit can be either 0 or 1, it is termed a half-digit; if it can take higher values without reaching 9 (often 3 or 5), it may be called three-quarters of a digit. A 5½ digit multimeter would display one "half digit" that could only display 0 or 1, followed by five digits taking all values from 0 to 9. Such a meter could show positive or negative values from 0 to 199,999. A 3¾ digit meter can display a quantity from 0 to 3,999 or 5,999, depending on the manufacturer.
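The digit-counting convention above can be sketched numerically; the helper name below is illustrative, not a standard term:

```python
# Illustrative sketch of the "fractional digit" convention described above.
# max_count is a made-up helper name, not standard terminology.
def max_count(full_digits, leading_max):
    """Largest displayable number: a leading digit running 0..leading_max,
    followed by full_digits digits each running 0..9."""
    return leading_max * 10**full_digits + (10**full_digits - 1)

print(max_count(4, 1))  # 19999  -> a 4½ digit meter
print(max_count(5, 1))  # 199999 -> a 5½ digit meter
print(max_count(3, 3))  # 3999   -> one common 3¾ digit convention
```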
While a digital display can easily be extended in resolution, the extra digits are of no value if not accompanied by care in the design and calibration of the analog portions of the multimeter. Meaningful (i.e., high-accuracy) measurements require a good understanding of the instrument specifications, good control of the measurement conditions, and traceability of the calibration of the instrument. However, even when its resolution exceeds its accuracy, a meter can be useful for comparing measurements. For example, a meter with 5½ stable digits may indicate that one nominally 100,000 ohm resistor is about 7 ohms greater than another, even though the error of each measurement is 0.2% of reading plus 0.05% of full-scale value.
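As a rough numeric illustration of that example (the 200 kΩ full-scale range is an assumption, as is reading the spec as a worst-case absolute error):

```python
# Assumed scenario: a 100 kΩ reading on a 200 kΩ full-scale range,
# with an accuracy spec of 0.2% of reading + 0.05% of full scale.
reading = 100_000.0     # ohms (nominal resistor value)
full_scale = 200_000.0  # ohms (assumed measurement range)

abs_error = 0.002 * reading + 0.0005 * full_scale
print(round(abs_error))  # 300 -> each reading is only good to about ±300 Ω,
# yet a stable 5½ digit display still resolves a 7 Ω *difference*
# between two such resistors measured under the same conditions.
```

The comparison works because the large systematic error terms largely cancel when the same meter, range, and conditions are used for both readings.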
Specifying "display counts" is another way to specify the resolution. Display counts give the largest number, or the largest number plus one (to include the display of all zeros), that the multimeter's display can show, ignoring the decimal separator. For example, a 5½ digit multimeter can also be specified as a 199999 display count or 200000 display count multimeter. Often the display count is just called the 'count' in multimeter specifications.
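Under that convention, the count spec and the largest displayable reading differ by a simple off-by-one:

```python
# Sketch: relation between the largest displayable reading and the count spec.
max_reading = 199_999      # a 5½ digit meter, per the convention above
counts = max_reading + 1   # includes the all-zeros display: a "200000 count" meter
print(counts)  # 200000
```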
The accuracy of a digital multimeter may be stated in a two-term form, such as "±1% of reading +2 counts", reflecting the different sources of error in the instrument.
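A sketch of how such a two-term spec translates into an error bound; the function name and the 1 mV count value are illustrative assumptions, not from any particular datasheet:

```python
def error_bound(reading, pct_of_reading, counts, count_value):
    """Worst-case error for a '±pct% of reading + N counts' spec.
    The count_value is the size of one least-significant-digit step."""
    return pct_of_reading / 100.0 * reading + counts * count_value

# Assumed example: a 1.000 V reading, 1 mV per count, spec of ±1% + 2 counts
e = error_bound(1.000, 1.0, 2, 0.001)
print(round(e, 3))  # 0.012 -> the true value lies within roughly 1.000 ± 0.012 V
```

The percentage term scales with the reading, while the counts term is a fixed offset set by the display resolution; near the bottom of a range, the counts term dominates.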