Measurement is the act, or the result, of a quantitative comparison between a predetermined standard and an unknown magnitude.
· Range
It represents the highest value that can be measured by an instrument.
· Scale sensitivity
It is defined as the ratio of the change in instrument output (scale reading or pointer deflection) to the corresponding change in the input being measured. In practice it indicates the smallest change in the measured variable to which the instrument responds.
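Since sensitivity is a ratio, it can be computed from two points on the instrument's characteristic. A minimal sketch in Python (the thermometer readings and units below are made up for illustration):

```python
def sensitivity(delta_output, delta_input):
    """Sensitivity = change in instrument reading (output)
    per unit change in the measured variable (input)."""
    return delta_output / delta_input

# Hypothetical thermometer: a 20 degree C rise in temperature moves
# the scale reading by 10 mm, so the sensitivity is 0.5 mm per degree C.
print(sensitivity(10.0, 20.0))  # 0.5
```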
· True or actual value
It is the actual magnitude of the signal input to a measuring system, which can only be approached and never exactly evaluated.
· Accuracy
It is defined as the closeness with which a reading approaches an accepted standard value or true value.
· Precision
It is the degree of reproducibility among several independent measurements of the same true value under specified conditions. It is usually expressed in terms of the deviation of the measurements.
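The distinction between closeness to the true value and reproducibility can be made concrete with a short sketch, assuming a set of repeated readings of a known 100.0-unit standard (the readings are invented for illustration):

```python
import statistics

def accuracy_error(readings, true_value):
    """Accuracy: closeness of the mean reading to the true value."""
    return statistics.mean(readings) - true_value

def precision(readings):
    """Precision: reproducibility of repeated measurements,
    expressed here as their standard deviation."""
    return statistics.stdev(readings)

# Hypothetical repeated readings of a 100.0-unit standard.
readings = [100.2, 100.1, 100.3, 100.2, 100.2]
print(round(accuracy_error(readings, 100.0), 2))  # systematic offset from true value
print(round(precision(readings), 3))              # scatter among the readings
```

Note that the readings above are precise (small scatter) but not accurate (a consistent offset of about 0.2 units from the standard).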
· Repeatability
It is defined as the closeness of agreement among a number of consecutive measurements of the output for the same value of input under the same operating conditions. It may be specified in terms of units for a given period of time.
· Reliability
It is the ability of a system to perform and maintain its function in routine circumstances. It reflects the consistency of a set of measurements or of a measuring instrument, and is often used to describe a test.
· Systematic Errors
A constant, uniform deviation in the operation of an instrument is known as a systematic error. Instrumental errors, environmental errors, and observational errors are systematic errors.
· Random Errors
Some errors remain even after the systematic and instrumental errors have been reduced or at least accounted for. The causes of such errors are unknown and hence the errors are called random errors.
Calibration is the process of determining and adjusting an instrument's accuracy to ensure that it is within the manufacturer's specifications.
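One common calibration scheme is a two-point linear correction against known standards. The sketch below is a hypothetical illustration, not a manufacturer's procedure; the raw and standard values are invented:

```python
def two_point_calibration(raw_low, raw_high, std_low, std_high):
    """Derive gain and offset so that corrected = gain * raw + offset,
    using the instrument's readings of two known standards."""
    gain = (std_high - std_low) / (raw_high - raw_low)
    offset = std_low - gain * raw_low
    return gain, offset

# Hypothetical: the instrument reads 2.0 and 98.0 against
# standards of 0.0 and 100.0 units.
gain, offset = two_point_calibration(2.0, 98.0, 0.0, 100.0)

# Apply the correction to a mid-scale raw reading.
corrected = gain * 50.0 + offset
print(round(corrected, 2))  # mid-scale reading after correction
```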