
Quality Assurance: Metrology and Calibration

The importance of measurement in our daily lives cannot be overemphasized. Every new technological or scientific breakthrough, industrial development, or commercial success depends on one form of measurement or another. In these modern times, we measure practically everything we encounter: the weight of our food, the volume of our fuel, the distance between two points, temperature, pressure, humidity, light, current, voltage, power, speed, energy, and so on. Needless to say, reliable measurement is vital to the semiconductor industry, or any industry for that matter. In fact, it is so important that there is a science behind it. Known as metrology, this science was developed and systematized to ensure that all measurements performed are meaningful and conform to international standards. After all, one cannot control or improve something that one cannot measure, and semiconductor manufacturing is one complex game of high-precision control and continuous improvement.

Good measurement relies on the integrity of the measuring equipment used. Unfortunately, no matter how sophisticated a piece of measuring equipment is, it degrades with time due to thermal, mechanical, electrical, and environmental effects. This degradation is called drift, and it is unavoidable. However, the effects of drift on the reliability of measurements may be offset by a process known as calibration. Calibration is simply the comparison of a measuring instrument's or equipment's performance against a reference standard of known accuracy. In addition to this determination and reporting of the deviation from nominal, it may also include correction (adjustment) to minimize the errors. Properly calibrated equipment provides confidence that the company's products and services meet customer specifications all over the world.
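As a rough illustration, the compare-and-adjust idea behind calibration can be sketched in a few lines of Python; the instrument readings, reference value, and tolerance below are hypothetical.

# Illustrative sketch only: checking a gauge against a reference standard of
# known accuracy, reporting its deviation from nominal, and suggesting a
# correction (adjustment) offset. All numbers are hypothetical.

def calibrate(reading_under_test: float, reference_value: float, tolerance: float):
    """Compare an instrument reading against a reference standard."""
    deviation = reading_under_test - reference_value   # error relative to the standard
    in_tolerance = abs(deviation) <= tolerance         # pass/fail against the allowed error
    correction = -deviation                            # offset that would restore nominal
    return deviation, in_tolerance, correction

# Example: a voltmeter reads 10.012 V while the reference standard supplies
# 10.000 V, and the instrument's specification allows +/-0.005 V of error.
dev, ok, corr = calibrate(10.012, 10.000, 0.005)
print(f"deviation={dev:+.3f} V, in tolerance={ok}, suggested correction={corr:+.3f} V")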

In the semiconductor industry, all critical equipment used in manufacturing is required to undergo periodic calibration. Another commonly encountered term in relation to calibration is verification. Verification is not the same as calibration: it refers to the comparison of measurement results against a specification, usually the manufacturer's published performance figures for the product.

Calibration Traceability

The basic concept behind calibration is that the measuring equipment should be tested against a standard of higher accuracy. To illustrate this, below is a typical hierarchical relationship among the various levels of calibration/measurement activities within a company:

National Standard - accurate to 0.002%
Calibration Laboratory - 0.01%
Company "Master" Item - 0.07%
Company Production Equipment - 1.0%
Produced Product - 10.0%

These calibrations need to be done on a planned, periodic basis, with evidence of the comparison results being recorded and maintained. The records must include identification of the specific standards used (which must be within their assigned calibration interval), as well as the methods and conditions used in the calibration process. These records should demonstrate an unbroken chain of comparisons that ends at the agency responsible for maintaining and developing a country's measurement standards (now generically known as a national metrology institute). This demonstrable linkage to national standards, with known accuracy, is known as 'traceability'.
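The chain above can also be expressed as data, which makes the "each level traces to something more accurate" requirement easy to check. The accuracy figures come from the table above; the code structure itself is only a hypothetical sketch.

# Illustrative sketch only: the traceability chain as a list of (level, % accuracy)
# pairs, with a check that every level is calibrated against a more accurate one.

traceability_chain = [
    ("National Standard",            0.002),   # % accuracy
    ("Calibration Laboratory",       0.01),
    ("Company 'Master' Item",        0.07),
    ("Company Production Equipment", 1.0),
    ("Produced Product",             10.0),
]

# Walk each adjacent pair and confirm the reference level is strictly more accurate.
for (ref_name, ref_acc), (item_name, item_acc) in zip(traceability_chain, traceability_chain[1:]):
    assert ref_acc < item_acc, f"{item_name} must trace to a more accurate standard"
    ratio = item_acc / ref_acc   # how much more accurate the reference is
    print(f"{item_name} ({item_acc}%) -> {ref_name} ({ref_acc}%), roughly {ratio:.0f}:1")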

Periodicity

Periodicity refers to the specified frequency of calibration, i.e., the regular interval between calibrations defined for a piece of equipment or a part thereof. The periodicity depends on how quickly the equipment drifts out of spec, what the equipment is used for (is the application critical or not?), and what the company can afford, but it normally ranges from 3 months to 2 years. Periodicity may also be expressed as the calibration cycle, which is the number of calibrations required per year; a 3-month periodicity, for example, corresponds to a calibration cycle of 4.

Characterization

Aside from calibration, which involves comparison of results against a standard and possibly adjustment to nominal settings, measuring equipment may also be characterized. Characterization pertains to the evaluation or study of parameters or properties which do not have specifications. Characterization may be useful in determining the overall limitations and capabilities of the equipment.

Uncertainty

In quantum physics, the Uncertainty Principle states that the exact position and exact velocity of a particle cannot be determined at the same time; one cannot measure something without changing it. This principle applies very well to metrology and calibration: every metrologist knows that there is no perfect measurement of anything. An excellent metrologist would therefore consider all factors affecting measurement uncertainty and ensure that the total uncertainty or inaccuracy of the measurement is less than the tolerance of the specification for the parameter being measured.

Basic Terminology

Accuracy - how close a measurement reading is to the 'true' value of the parameter being measured
Precision - how repeatable or closely grouped the measurement readings are
Resolution - the level of discrimination that the measuring equipment can show; the smallest unit change that it can discern or detect
Sensitivity - the smallest change in the input (stimulus) that causes a discernible change in the output
Stability - the tendency of the measuring equipment to keep its measurement characteristics constant over time

Test Accuracy Ratio

The Test Accuracy Ratio, or TAR, is the ratio of the specification tolerance of the parameter being measured to the uncertainty of the measuring instrument: TAR = specification tolerance / measurement uncertainty. Thus, the higher the TAR the better, but a higher TAR will also cost more. The metrologist should therefore aim for the most cost-effective TAR, i.e., the lowest TAR that will meet the company's quality objectives.

Guardbanding

A low TAR does not mean that the company has to erroneously accept actual rejects as a result of measurement errors. Guardbanding is the process of tightening the pass limits of the specification to account for the uncertainty of the measuring equipment and other factors. Guardbanding may reject some units that would otherwise be marginally acceptable, but this is still better than shipping 'bad' products to the customer.
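A minimal sketch of both ideas follows. The TAR formula (specification tolerance divided by measurement uncertainty) is as stated above; the guardbanding rule shown, tightening each pass limit by the measurement uncertainty, is one common choice and is assumed here for illustration. All numbers are hypothetical.

# Illustrative sketch only: Test Accuracy Ratio and guardbanded pass limits.

def test_accuracy_ratio(spec_tolerance: float, uncertainty: float) -> float:
    """TAR = specification tolerance / measurement uncertainty."""
    return spec_tolerance / uncertainty

def guardbanded_limits(lower: float, upper: float, uncertainty: float):
    """Tighten the pass limits by the measurement uncertainty on each side
    (one simple guardbanding rule; assumed for this example)."""
    return lower + uncertainty, upper - uncertainty

# Example: a parameter specified as 5.0 +/- 0.5 units, measured with an
# instrument whose uncertainty is 0.1 units.
spec_tol, unc = 0.5, 0.1
print("TAR =", test_accuracy_ratio(spec_tol, unc))               # roughly 5
print("guardbanded limits:", guardbanded_limits(4.5, 5.5, unc))   # roughly (4.6, 5.4)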
