The vertical axis represents the frequency of the measurement result and the horizontal axis represents the values of the results (X).
The central vertical line represents the mean value of all the measurement results. The vertical line marked T represents the true value of the measurand. The difference between the mean value and the T line is the accuracy of the measurement. The standard deviation (marked σ) of the measurement results about the mean value is a quantitative measure of the precision of the measurement.

Unfortunately, the accuracy defined in this manner cannot be determined, as the true value (T) of a measurement cannot be obtained owing to the errors prevalent in the measurement process. The only way to obtain an estimate of accuracy is to use a higher level measurement standard in place of the measuring instrument to perform the measurement and to take the resulting mean value as the true value. This is what is usually done in practice. The line (S) represents the mean value obtained using a higher level measurement standard. Thus the accuracy figures quoted by instrument manufacturers in their technical literature are the difference between the measurement result displayed by the instrument and the value obtained when a higher level measurement standard is used to perform the measurement. In the case of simple instruments the accuracy indicated is usually the calibration accuracy, e.g. in the calibration of a micrometer a series of gauge blocks is used. If the values displayed by the micrometer over its usable range fall within 0.01 mm of the values assigned to the gauge blocks, then the accuracy of the micrometer is reported as 0.01 mm.

It can be seen that the definition of error given previously (Section 2.2.4) is very similar to the definition of accuracy. In fact, error and accuracy are interchangeable terms: some prefer the term error and others prefer accuracy. Generally instrument manufacturers prefer the term accuracy, as they do not wish to highlight the fact that their instruments have errors.

Relative accuracy and per cent relative accuracy are also concepts in use. Their definitions are similar to those of relative error and per cent relative error, i.e. relative accuracy is obtained by dividing accuracy by the average measured result, and per cent relative accuracy is computed by multiplying relative accuracy by 100. A numerical sketch at the end of this section illustrates these quantities.

2.2.8 Calibration

Calibration is the process of comparing the indication of an instrument or the value of a material measure (e.g. the value of a weight or the graduations of a length measuring ruler) against values indicated by a measurement standard under specified conditions. In the process of calibrating an instrument or material measure, the test item is either adjusted or correction factors are determined. Not all instruments or material measures are adjustable. Where the instrument cannot be adjusted, it is possible to determine correction factors, although this method is not always satisfactory for a number of reasons, the primary one being the non-linearity of response of most instruments; a second sketch at the end of this section illustrates the correction-factor approach. For example, in the calibration of a mercury-in-glass thermometer between
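
Returning to the quantities defined above, the following is a minimal numerical sketch of the mean, the precision (standard deviation), the accuracy estimated against a higher level standard, and the relative and per cent relative accuracy. The repeat readings and the standard value are invented for illustration and do not come from the text.

```python
import statistics

# Invented repeat readings from an instrument under test (illustrative only)
readings = [10.03, 10.05, 10.02, 10.06, 10.04, 10.03, 10.05]

# Assumed mean value obtained with a higher level measurement standard,
# used in place of the unknowable true value (the line S in the figure)
standard_value = 10.00

mean = statistics.mean(readings)        # central vertical line in the figure
precision = statistics.stdev(readings)  # standard deviation about the mean
accuracy = mean - standard_value        # estimate of accuracy (error)

relative_accuracy = accuracy / mean              # accuracy / average result
percent_relative_accuracy = relative_accuracy * 100

print(f"mean = {mean:.4f}")
print(f"precision (standard deviation) = {precision:.4f}")
print(f"accuracy estimate = {accuracy:+.4f}")
print(f"relative accuracy = {relative_accuracy:.5f}")
print(f"per cent relative accuracy = {percent_relative_accuracy:.3f} %")
```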

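The second sketch illustrates the correction-factor approach mentioned in the calibration paragraph: corrections determined at a few calibration points are stored and applied to later readings by linear interpolation. The calibration points are assumed values chosen for illustration, and linear interpolation is itself an approximation that becomes questionable when the instrument's response is markedly non-linear.

```python
# Assumed calibration points for a non-adjustable instrument: at each nominal
# reading, correction = (standard value - instrument indication).
# These numbers are illustrative only, not taken from the text.
calibration_points = [
    (0.0, +0.10),
    (50.0, +0.05),
    (100.0, -0.08),
]

def corrected(reading: float) -> float:
    """Apply a correction interpolated linearly between calibration points."""
    points = sorted(calibration_points)
    if reading <= points[0][0]:
        return reading + points[0][1]
    if reading >= points[-1][0]:
        return reading + points[-1][1]
    for (x0, c0), (x1, c1) in zip(points, points[1:]):
        if x0 <= reading <= x1:
            fraction = (reading - x0) / (x1 - x0)
            return reading + c0 + fraction * (c1 - c0)

print(corrected(25.0))  # 25.075: correction interpolated between 0 and 50
```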