
4 Why calibrate?
The definition of the term “calibration” is:
To determine the difference between the object being
calibrated and a known reference, while knowing the
uncertainty of the process.
Adjusting the calibrated object to remove the error is
of secondary importance, as long as the error is either
within acceptable limits or is used to calculate the
true value.
Calibration is an important part of any quality
assurance routine. When a calibration routine is
created, all aspects are taken into consideration by
qualified personnel, so that when an operator performs
calibrations later, the process is done the same way
every time, ensuring that the results are comparable
and reliable.
By documenting the process and the results, you make
the whole process traceable. In this way many human
errors can be eliminated.
Good quality means achieving the accuracy you need,
and not far beyond. If your demand is to keep the
temperature in your process within e.g. 300 to 310°C,
define your accuracy as 305 ±5°C. The example
applies to the TC65.
Your calibration process should now ensure that your
temperature instrument stays within these limits, not
necessarily at the peak performance given in the
specifications of the instrument. A good rule of thumb
is to make the accuracy demand for your calibration
twice as tight as the process accuracy demand. This
leaves the accuracy demand for your calibration at
±2.5°C.
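The rule of thumb above can be sketched in a few lines of code. This is a minimal illustration, assuming hypothetical calibration points; the function name and values are not from any datasheet:

```python
# Process demand: keep the temperature within 300 to 310 °C.
process_low, process_high = 300.0, 310.0
setpoint = (process_low + process_high) / 2      # 305 °C
process_tol = (process_high - process_low) / 2   # ±5 °C

# Rule of thumb: calibration tolerance is half the process tolerance.
calibration_tol = process_tol / 2                # ±2.5 °C

def passes_calibration(reference, reading, tol=calibration_tol):
    """True if the instrument reading deviates from the known
    reference by no more than the calibration tolerance."""
    return abs(reading - reference) <= tol

# Hypothetical points: (reference °C, instrument reading °C)
for ref, read in [(300.0, 301.1), (305.0, 304.2), (310.0, 312.9)]:
    print(ref, read, passes_calibration(ref, read))
```

A reading of 312.9°C against a 310°C reference deviates by 2.9°C and therefore fails the ±2.5°C demand, even though it may still be inside the instrument's own specifications.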
The whole chain of elements involved in the
calibration process contributes errors to the final
reading. In this context, errors are expressed as
uncertainty.
Take the following elements into account when
considering the total uncertainty:
• Temperature drift in the calibrator
• Temperature drift in the calibrated instrument
• Temperature gradients in the calibrator well
• Differences in gradients caused by variance in
probe mass (calibration object)
• Inaccuracy in the calibrator
• Inaccuracy in the calibration object
• Variance in thermal contact between the probe
being calibrated and the calibrator well
• Reading error (if the calibration object has an
analog scale)
Remember, all these elements are NOT simply added to
calculate the final uncertainty. In real life it is
unlikely that all uncertainty elements will be at their
maximum and in the same direction simultaneously!
Therefore, another method of calculation is used.
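The text does not name the method, but the usual way to combine independent uncertainty contributions is the root-sum-square (RSS) of the individual terms, rather than their plain sum. A minimal sketch with purely hypothetical values (not taken from any instrument specification):

```python
import math

# Hypothetical uncertainty contributions in °C, one per element
# in the list above; the values are illustrative only.
contributions = {
    "calibrator drift": 0.05,
    "instrument drift": 0.10,
    "well gradients": 0.15,
    "probe-mass gradient difference": 0.10,
    "calibrator inaccuracy": 0.20,
    "calibration-object inaccuracy": 0.25,
    "thermal contact variance": 0.10,
    "reading error": 0.05,
}

# Worst case: plain sum, assuming every element is at its maximum
# and in the same direction at once (overly pessimistic).
worst_case = sum(contributions.values())

# Root-sum-square: the common way to combine independent terms.
rss = math.sqrt(sum(u**2 for u in contributions.values()))

print(f"worst case ±{worst_case:.2f} °C, RSS ±{rss:.2f} °C")
# Here the plain sum gives ±1.00 °C but the RSS only ±0.40 °C.
```

The RSS result is markedly smaller than the worst-case sum, which is exactly the point made above: the maxima are unlikely to coincide in the same direction.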