Resolution
Resolution is the numeric ratio of the maximum displayed value to the
minimum displayed value on a selected range. Resolution is often
expressed in percent, parts-per-million (ppm), counts, or bits.
For example, a 6½-digit multimeter with 20% overrange capability can
display a measurement with up to 1,200,000 counts of resolution.
This corresponds to about 0.0001% (1 ppm) of full scale, or 21 bits
including the sign bit. All four specifications are equivalent.
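
The arithmetic behind these figures can be checked directly. The short
Python sketch below is illustrative only (it is not part of the manual)
and reproduces each equivalent form from the 1,200,000-count figure.

    import math

    counts = 1_200_000                  # maximum displayed counts
    ppm = 1e6 / counts                  # resolution in ppm of full scale
    percent = 100.0 / counts            # the same figure as a percentage
    bits = math.log2(counts) + 1        # binary resolution plus the sign bit

    print(f"{ppm:.2f} ppm of full scale")    # ~0.83 ppm, i.e. about 1 ppm
    print(f"{percent:.5f}% of full scale")   # ~0.00008%, i.e. about 0.0001%
    print(f"{bits:.1f} bits incl. sign")     # ~21.2 bits, i.e. about 21 bits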
Accuracy
Accuracy is a measure of the “exactness” to which the internal DMM’s
measurement uncertainty can be determined relative to the calibration
reference used. Absolute accuracy includes the internal
DMM’s relative
accuracy specification plus the known error of the calibration reference
relative to national standards (such as the U.S. National Institute of
Standards and Technology). To be meaningful, the accuracy specifications
must be accompanied by the conditions under which they are valid.
These conditions should include temperature, humidity, and time.
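
As an illustration of how relative accuracy and the calibration
reference's error combine, the sketch below uses a hypothetical
"% of reading + % of range" specification and a hypothetical 2 ppm
calibration reference. The numbers are placeholders, not the 34970A's
published specifications.

    def dmm_error(reading, rng, pct_reading, pct_range):
        """Relative accuracy: error bound versus the calibration source."""
        return reading * pct_reading / 100 + rng * pct_range / 100

    reading, rng = 5.0, 10.0                            # 5 V measured on the 10 V range
    relative = dmm_error(reading, rng, 0.0040, 0.0007)  # hypothetical spec values
    reference = reading * 2e-6                          # hypothetical 2 ppm reference error
    absolute = relative + reference                     # traceable to national standards

    print(f"relative: +/-{relative*1e6:.0f} uV, absolute: +/-{absolute*1e6:.0f} uV")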
There is no standard convention among instrument manufacturers for
the confidence limits at which specifications are set. The table below
shows the probability of non-conformance for each specification with the
given assumptions.
    Specification Criteria        Probability of Failure
    Mean ± 2 sigma                4.5%
    Mean ± 3 sigma                0.3%
Variations in performance from reading to reading, and from instrument
to instrument, decrease as the number of sigma used to set a given
specification increases. This means that you can achieve greater actual
measurement precision for a given accuracy specification. The 34970A is
designed and tested to perform better than mean ±3 sigma of the
published accuracy specifications.
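
The table's failure probabilities follow directly from assuming
normally distributed errors: the chance that a reading falls outside
mean ± k sigma is 2 × (1 − Φ(k)). A minimal sketch, under that
assumption, reproduces the tabulated values.

    from math import erf, sqrt

    def prob_outside(k_sigma):
        """Probability a normal variate falls outside mean +/- k sigma."""
        phi = 0.5 * (1 + erf(k_sigma / sqrt(2)))   # standard normal CDF at k
        return 2 * (1 - phi)

    for k in (2, 3):
        print(f"mean +/- {k} sigma: {prob_outside(k)*100:.2f}% non-conformance")
    # mean +/- 2 sigma: 4.55%   (table: 4.5%)
    # mean +/- 3 sigma: 0.27%   (table: 0.3%)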