Instructions

1. Determine the tolerance of the tool. Consult the manufacturer's technical literature to find the tool's accuracy. For example, a manufacturer may specify that the alignment of a saw is accurate to within 1/10 inch.

2. Locate the tolerance of the calibration standard. Refer to the technical literature for the tool or standard if you do not have the tolerance readily available. For example, a laser distance meter might have an accuracy of 6/1000 inch.

3. Compute the ratio of tool tolerance to calibration standard accuracy. Divide the tolerance of the tool being calibrated by the accuracy of the calibration standard. For example, 0.1 divided by 0.006 equals 16.667. Express the result as the test accuracy ratio, such as 16.667:1 (see the sketch below).
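Since the calculation is a single division, it can be scripted directly. The following is a minimal Python sketch of the steps above; the function name and the example values are illustrative, not from any standard or vendor library.

```python
def test_accuracy_ratio(tool_tolerance: float, standard_accuracy: float) -> float:
    """Return the test accuracy ratio (TAR) of a tool against its
    calibration standard, i.e. the N in an 'N:1' ratio."""
    if standard_accuracy <= 0:
        raise ValueError("standard accuracy must be positive")
    return tool_tolerance / standard_accuracy

# Example from the steps above: a saw aligned to within 0.1 inch,
# checked against a laser distance meter accurate to 0.006 inch.
tar = test_accuracy_ratio(0.1, 0.006)
print(f"Test accuracy ratio: {tar:.3f}:1")  # -> 16.667:1
```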

Thermometer Resolution, Accuracy and Tolerance

Resolution should not be confused with accuracy. The resolution of an instrument is the smallest value that is shown on the display. Thus an instrument with 0.1°C resolution will read to the nearest 0.1°C (perhaps 46.6°C), whereas a 1°C resolution instrument will only read to the nearest 1°C (i.e. 47°C). The better the resolution, the better the measurement display capability of the instrument. This does not necessarily mean that the instrument is going to be more accurate; however, it does usually indicate a superior instrument.

Accuracy & Tolerance

Accuracy is the degree of conformity with the established standard; hence accuracy = zero means the display corresponds exactly to the ideal value. Accuracy is a statement of how accurate an instrument is compared with the known temperature. It is usually accompanied by a reference to a tolerance, as it is very unlikely that anything will be exactly accurate, i.e. accuracy = zero. Tolerance corresponds to the value of inaccuracy that is inherent in the instrument by virtue of its manufacture or capabilities. Thus accuracy will usually be stated as a tolerance about, or either side of, an absolute or exact temperature. This tolerance may be stated as a measured amount, i.e. ±0.5°C, or as a percentage, i.e. ±2% (at 50°C the tolerance will be ±1.0°C, but at 100°C the tolerance will be ±2.0°C). The tolerance may also be accompanied by a reference to the final digit of the reading, and will therefore be an additional ±1 for a 1°C resolution instrument or ±0.1 for a 0.1°C instrument, and so on. Often this will also be combined with an indication of the range of temperatures over which the accuracy is appropriate.

Therefore accuracy may be described as follows: ±0.5°C ±1 digit over the range 30 to 100°C, or perhaps ±1% over the range 100 to 250°C.

Which is the most accurate, a sundial or a watch?
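To make these specifications concrete, here is a minimal Python sketch of the two ideas just described: quantizing a reading to a display resolution, and expanding a "±0.5°C ±1 digit" spec into a worst-case band. The function names and values are hypothetical illustrations, not a real instrument API.

```python
def displayed_reading(true_value: float, resolution: float) -> float:
    """Round a value to the nearest multiple of the display resolution."""
    return round(true_value / resolution) * resolution

def tolerance_band(reading: float, accuracy: float, resolution: float,
                   digits: int = 1) -> tuple[float, float]:
    """Worst-case (low, high) band for a reading under a spec of the form
    ±accuracy ±N digits, where one digit equals the display resolution."""
    half_width = accuracy + digits * resolution
    return reading - half_width, reading + half_width

# A 0.1 °C-resolution display versus a 1 °C one, for the same input:
print(displayed_reading(46.63, 0.1))   # -> 46.6
print(displayed_reading(46.63, 1.0))   # -> 47.0

# Spec from the text: ±0.5 °C ±1 digit, on a 0.1 °C-resolution instrument:
low, high = tolerance_band(47.0, accuracy=0.5, resolution=0.1)
print(f"true temperature lies between {low:.1f} and {high:.1f} °C")
# -> between 46.4 and 47.6 °C
```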

The differences between tolerance and accuracy may be considered with reference to time. A sundial may have dial marks at every half hour, so its resolution is strictly half an hour (although you could estimate more closely). However, its accuracy is absolute, i.e. = zero (if properly positioned, of course, and if it's sunny!). A watch, on the other hand, may have a second hand and therefore a resolution of 1 second, and yet the actual time, and therefore the accuracy, may be totally wrong. As the saying goes, "a stopped watch is absolutely accurate twice a day!"

Tolerances should be taken into account when measuring temperature, as with any other measurement. If you have a table that is going to be placed in the centre of a large room, it won't matter if it is a little bigger or smaller than the ideal size. If it is a cupboard that fits into an alcove, then it may be slightly narrower than the alcove, but no wider. A gap for a dishwasher in a kitchen range wants to be as close as possible to the actual width to prevent gaps, i.e. a very small plus tolerance, but no minus tolerance.

Tolerances for temperature measurement will depend on a number of factors:

- The stability and accuracy of the instrument.
- An understanding of calibration tolerances and the need to apply the correction factors to the actual readings that you are taking.
- The importance of the process temperature: if you need to store food at 1 to 5°C, set the targets at 2 to 4°C and give yourself a tolerance of ±1°C.

But the most important thing is to keep records, so that you can analyse the performance of both the thermometer and the process. Looking at trends enables an educated assessment of what settings are required and provides the ability to be as effective and efficient as possible.
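As a sketch of the correction-and-record-keeping advice above, the following Python fragment applies an assumed calibration correction to hypothetical readings and flags any that fall outside the target band; every value in it is made up for illustration.

```python
# Hypothetical calibration certificate: the probe reads 0.3 °C high,
# so the correction to apply to each raw reading is -0.3 °C.
CORRECTION = -0.3
TARGET_LOW, TARGET_HIGH = 2.0, 4.0   # targets set inside the 1-5 °C band

readings = [4.1, 4.4, 3.9, 4.6, 4.3]  # raw display values (hypothetical log)

for raw in readings:
    corrected = raw + CORRECTION
    in_target = TARGET_LOW <= corrected <= TARGET_HIGH
    print(f"raw {raw:.1f} °C -> corrected {corrected:.1f} °C "
          f"{'OK' if in_target else 'OUT OF TARGET'}")
```

Keeping a log like this, rather than single spot checks, is what makes the trend analysis described above possible.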

Tolerance has a definite range on the + and - side. Accuracy is the measure of how close the reading comes to the true value within that tolerance. The two are different concepts.

Accuracy of Measure: An indication of how close the result of a measurement comes to the true value. The measurement is not exact because neither the measuring process nor the instruments used are exact.

Approximate number: Any number that results from a measurement.

Error of Measurement: The largest possible difference between the actual dimension or quantity and the measured dimension or quantity. It is 1/2 of the smallest fraction of a unit on the measuring device. It is not a mistake, but a way of recognizing the limits of the measuring device. Recording the error of measurement is another way of showing the precision of the measurement.

Exact number: Any number that results from counting.

Lower Limit: The smallest acceptable dimension in a tolerance specification. For example, in a specification of 10 cm ± 0.1 cm, the lower limit is 9.9 cm. (The upper limit is 10.1 cm.)

Precision: A measure of how identically a measurement is repeated, without reference to a "true" or "real" value. In a measuring instrument, precision is defined as the smallest fraction or decimal division on the instrument.

Precision of instruments: All measuring instruments give approximate values for a measurement. The degree of precision available with a measuring instrument is considered to be the precision of the instrument. For example, a micrometer whose scale permits measurements to be made to 0.001 inch is said to have a precision of 0.001 inch.

Significant digit: Those digits in a measured number - or digits in numbers obtained in arithmetic operations involving measured numbers - that we can be certain are correct. Often the last significant digit in a number is shown with a line under it.

Tolerance: The greatest range of variation that can be allowed in the dimension of a manufactured part for it to be acceptable and fit together with other parts.

Tolerance Interval: The range from the lower limit of a dimension to the upper limit. For example, if the tolerance is given as 3.245" ± 0.003", the range or tolerance interval is 0.006".

Uncertainty in Measurement: The differences in the value of repeated measurements, caused either by the measuring device or by the person doing the measurement, or both.

Upper Limit: The largest acceptable dimension in a tolerance specification. For example, in a specification of 10 cm ± 0.1 cm, the upper limit is 10.1 cm. (The lower limit is 9.9 cm.)

Variation: The difference between two measurements. For example, if two measurements of the width of a doorway are made and one is 35 14/16 inches while the other is 35 15/16 inches, the variation in the two measurements is 1/16 inch.
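These limit and interval definitions reduce to one-line arithmetic. A short Python sketch reusing the glossary's own examples (the function name is illustrative):

```python
def limits(nominal: float, tolerance: float) -> tuple[float, float, float]:
    """Return (lower limit, upper limit, tolerance interval) for a
    dimension specified as nominal ± tolerance."""
    lower, upper = nominal - tolerance, nominal + tolerance
    return lower, upper, upper - lower

low, high, interval = limits(10.0, 0.1)        # the 10 cm ± 0.1 cm example
print(f"lower {low} cm, upper {high} cm, interval {interval:.1f} cm")
# -> lower 9.9 cm, upper 10.1 cm, interval 0.2 cm

_, _, interval = limits(3.245, 0.003)          # the 3.245" ± 0.003" example
print(f'tolerance interval {interval:.3f}"')   # -> 0.006"
```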

The Difference Between Accuracy And Precision Measurement In Your Machine Shop

Your QA manager can put you to sleep explaining the difference between these two terms, but you really need to know the difference.

Accuracy describes 'close to true value;' Precision describes 'repeatability.'

Accuracy in measurement describes how closely the measurement from your system matches the actual or true measurement of the thing being measured. It is the difference between the observed average of measurements and the true average. Think of accuracy as the trustworthiness of a measurement system. Precision in measurement describes how well a measurement system will return the same measure; that is its Repeatability. As the classic target diagrams illustrate, it is important to be both accurate and precise if you are to get usable information from your measurement system. But repeatability has two components: that of the measurement system (gage) itself and that of the operator(s). Differences resulting from different operators using the same measurement device are called Reproducibility.
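Numerically, the distinction is easy to demonstrate. A minimal Python sketch with made-up data, treating bias as accuracy and spread as precision:

```python
from statistics import mean, stdev

true_value = 25.000                                  # known reference, mm
measurements = [25.08, 25.07, 25.09, 25.08, 25.07]   # hypothetical data

bias = mean(measurements) - true_value   # accuracy: offset from the truth
spread = stdev(measurements)             # precision: repeatability

print(f"bias {bias:+.3f} mm (accuracy), spread {spread:.3f} mm (precision)")
# -> bias +0.078 mm, spread 0.008 mm: a precise (tight) but inaccurate
#    (off-centre) gage - the classic tight-cluster-off-target picture.
```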

In our shops, we cannot tell if our measurement system has repeatability or reproducibility issues without doing a Long Form Gage R&R study. Gage repeatability and reproducibility studies (GR&R) use statistical techniques to identify and discern the sources of variation in our measurement system: is it the gage, or is it the operator? Gage error determined by the GR&R is expressed as a percentage of the tolerance that you are trying to hold. Typically, 10% or less gage error is considered acceptable; over 30% is unacceptable; between 10 and 30% gage error may be acceptable depending on the application. Regardless, any level of gage error is an opportunity for continuous improvement.
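A real Long Form study follows the AIAG worksheets and constants; the toy sketch below is only a simplified variance-based illustration of the same decomposition (repeatability within operators, reproducibility between them), with entirely made-up data and tolerance.

```python
from math import sqrt
from statistics import mean, pvariance

# Hypothetical study: each operator measures the same part several times.
data = {
    "operator_A": [10.02, 10.01, 10.03, 10.02],
    "operator_B": [10.06, 10.05, 10.07, 10.06],
    "operator_C": [10.03, 10.04, 10.03, 10.04],
}
TOLERANCE = 0.30  # total tolerance width being held (hypothetical)

# Repeatability: average within-operator variance (the gage itself).
repeat_var = mean(pvariance(vals) for vals in data.values())
# Reproducibility: variance of the operator means (the operator effect).
reprod_var = pvariance([mean(vals) for vals in data.values()])

grr_sigma = sqrt(repeat_var + reprod_var)
pct_grr = 100 * (6 * grr_sigma) / TOLERANCE   # 6-sigma spread vs tolerance
print(f"%GR&R = {pct_grr:.1f}% of tolerance")
# Here the operator effect dominates the gage effect - a reproducibility
# problem - and %GR&R lands well over 30%, i.e. unacceptable by the
# rule of thumb above.
```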
Span error: the maximum deviation in measured full scale span at reference temperature, relative to the ideal (or target) full scale span as determined from the ideal transfer function. See Thermal Effect on Span.
