Temperature measurement is the tie that binds virtually every type of process industry. It’s used to keep us safe – for example, when storing vaccines or transporting food as part of the cold chain. It can help us produce better products, such as in the manufacturing stages of plastics and metal production. It’s even used 30,000 feet in the air, to monitor both the engine and the cabin of aircraft.
Not surprisingly, the accuracy of temperature measuring sensors is essential – not only to ensure a quality finished product, but also to avoid potential equipment failure, product loss and reputational risk.
In this month’s blog, we’ve sat down with Lab Manager Slava Peciurov to discuss what Alpha Controls & Instrumentation needs to consider when calibrating different types of temperature measuring sensors.
1. Type of Sensor
First, not all sensors are created equal. They vary depending on the application, range and even how they are used. This means the calibration procedures can, and will, change.
Slava singles out three primary types of sensors Alpha calibrates:
• Resistance temperature detectors (RTDs)
• Thermistors
• Thermocouples (TCs)
RTDs are electrical devices that measure temperature through the change in resistance of a metal wire. Copper, nickel and platinum are typically used to make RTDs – with platinum resistance thermometers (PRTs) the most common. During a calibration, we’re measuring the resistance at different temperature points. These resistance values are required when generating new coefficients. The other way we can calibrate an RTD is by doing a comparison calibration against a master temperature standard.
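To make the coefficient step concrete, here’s a minimal sketch (in Python, using illustrative readings rather than real calibration data) of deriving the Callendar–Van Dusen A and B coefficients for a Pt100 from comparison-calibration points at and above 0°C, where R(T) = R0(1 + A·T + B·T²):

```python
# Sketch: deriving Callendar–Van Dusen coefficients A and B for a Pt100 RTD
# from comparison-calibration data (T >= 0 °C), where
#     R(T) = R0 * (1 + A*T + B*T**2)
# The readings below are illustrative, not real calibration data.

def fit_cvd(points):
    """points: [(temp_C, resistance_ohm), ...] including one point at 0 °C."""
    r0 = next(r for t, r in points if t == 0.0)
    # Least-squares normal equations for y = A*t + B*t^2, y = R/R0 - 1
    pts = [(t, r / r0 - 1.0) for t, r in points if t != 0.0]
    s_t2 = sum(t * t for t, _ in pts)
    s_t3 = sum(t ** 3 for t, _ in pts)
    s_t4 = sum(t ** 4 for t, _ in pts)
    s_ty = sum(t * y for t, y in pts)
    s_t2y = sum(t * t * y for t, y in pts)
    det = s_t2 * s_t4 - s_t3 * s_t3
    a = (s_ty * s_t4 - s_t2y * s_t3) / det
    b = (s_t2 * s_t2y - s_t3 * s_ty) / det
    return r0, a, b

# Illustrative readings close to the IEC 60751 nominal Pt100 curve
readings = [(0.0, 100.00), (100.0, 138.51), (200.0, 175.86)]
r0, a, b = fit_cvd(readings)
```

With readings near the nominal curve, the fit lands close to the standard A ≈ 3.9083×10⁻³ with a small negative B.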
Thermistors are like RTDs, but are made from a polymer or ceramic material. They come in two types: negative temperature coefficient (NTC) thermistors, in which resistance decreases as temperature increases; and positive temperature coefficient (PTC) thermistors, in which resistance increases as temperature increases. The calibration procedure is similar to that of RTDs: we take several different measurement values and use them to derive coefficients that can then be compared against the manufacturer’s specifications.
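For NTC thermistors, a common way to turn such coefficients into a temperature reading is the Steinhart–Hart equation. A short sketch, using textbook coefficients for a generic 10 kΩ NTC – a real calibration would derive device-specific values from several (resistance, temperature) measurements:

```python
import math

# Sketch: converting an NTC thermistor's measured resistance to temperature
# with the Steinhart–Hart equation: 1/T = A + B*ln(R) + C*ln(R)**3 (T in kelvin).
# A/B/C below are textbook values for a typical 10 kOhm NTC, used for
# illustration only; a calibration derives device-specific coefficients.
A = 1.129148e-3
B = 2.34125e-4
C = 8.76741e-8

def ntc_temperature_c(resistance_ohm):
    ln_r = math.log(resistance_ohm)
    inv_t = A + B * ln_r + C * ln_r ** 3
    return 1.0 / inv_t - 273.15
```

With these coefficients, a 10 kΩ reading lands very near 25°C, and lower resistance maps to higher temperature – the NTC behaviour described above.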
Lastly, thermocouples are made up of two dissimilar metal wires joined together to form a junction that, when heated or cooled, generates a voltage difference. This difference can then be measured and used to calculate temperature changes. Calibration methods vary for thermocouples, from a fixed-point method to using dry blocks or thermal baths.
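As a simplified illustration of the voltage-to-temperature step, the sketch below interpolates a measured EMF against a few published type K reference-table points (reference junction at 0°C). Real readouts use the full ITS-90 polynomial fits rather than this short table:

```python
# Sketch: a simplified thermocouple readout that linearly interpolates a
# measured EMF against reference-table values (type K, reference junction
# at 0 °C). Only a few published table points are included, for illustration.
TYPE_K_TABLE = [  # (temperature °C, EMF mV)
    (0.0, 0.000),
    (100.0, 4.096),
    (200.0, 8.138),
    (300.0, 12.209),
]

def emf_to_temperature_c(emf_mv):
    for (t0, e0), (t1, e1) in zip(TYPE_K_TABLE, TYPE_K_TABLE[1:]):
        if e0 <= emf_mv <= e1:
            # Linear interpolation within the bracketing table segment
            return t0 + (emf_mv - e0) * (t1 - t0) / (e1 - e0)
    raise ValueError("EMF outside table range")
```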
2. Temperature Range
Just as each temperature sensor is built differently, so too do their ranges differ. For example, the metals used in RTDs each vary in resistance differently as temperature changes. PRTs can measure up to approximately 650°C, while nickel RTDs top out around 300°C and copper around 120°C. Thermistors, meanwhile, can have a range from -50°C to 250°C, while thermocouples operate anywhere from -200°C to 1,750°C.
Therefore, during a calibration, Slava says Alpha will want to determine anywhere from three to five points throughout the range the sensor is typically used within – for example, below, at and above the set point. He notes Alpha, like most calibration service providers, won’t look to calibrate the whole temperature range of the sensor. For example, if a sensor is calibrated at the very top of its range, it can lead to increased drift.
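The point-selection step can be sketched as a small helper. The evenly spaced spread around the set point is an illustrative policy for this example, not Alpha’s actual procedure:

```python
# Sketch: choosing calibration points around a process set point rather than
# across the sensor's full range. Assumed (illustrative) policy: n evenly
# spaced points spanning a margin below and above the set point.
def calibration_points(set_point_c, margin_c, n=5):
    lo, hi = set_point_c - margin_c, set_point_c + margin_c
    step = (hi - lo) / (n - 1)
    return [round(lo + i * step, 2) for i in range(n)]

# e.g. a 100 °C set point checked 20 °C below, at, and above:
points = calibration_points(100.0, 20.0, 5)
```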
3. Accuracy
Similarly, each type of temperature sensor will also have different acceptable ranges of accuracy. Slava notes calibration service providers such as Alpha follow ranges dictated by governing bodies such as the American Society for Testing and Materials (ASTM) and the International Electrotechnical Commission (IEC), as well as international standards such as ISO/IEC 17025.
Just how accurate is accurate enough can also be further dictated by the customer’s needs – some of Alpha’s customers have even more stringent accuracy requirements than many of the international standards outlined above. Slava notes Alpha has the capability at its in-house calibration lab in Markham, ON, to calibrate temperature probes to within 10 millikelvin (mK) – that is, 0.01°C – from approximately -200°C to 660°C for many probe types.
4. Immersion Depth
Immersion depth refers to how deep the sensor needs to be inserted into a temperature source for it to achieve sufficient accuracy. In most cases, the required depth depends on where the sensing elements are located within the sensor itself.
As Slava explains, the sensing elements need to be fully immersed in the temperature source for the sensor to provide an accurate measurement. Otherwise, the sensor can read lower than the actual temperature during high-temperature calibrations (e.g. 98°C instead of 100°C), or higher at low temperatures (e.g. -18°C versus -20°C).
In general, the rule of thumb is a depth equal to 15 times the probe diameter plus the length of the sensing element (e.g. one to two inches). This means a PRT with a ¼-inch diameter and a one-inch-long sensing element would need 15 × ¼ in + 1 in = 4.75 inches of immersion. For sensors with a thick-wall construction, Slava notes the multiplier should increase to 20 or higher.
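The rule of thumb above is easy to capture in code. A minimal sketch:

```python
# Sketch of the rule-of-thumb minimum immersion depth described above:
# depth = multiplier * probe diameter + sensing-element length, with the
# multiplier raised (e.g. to 20) for thick-walled probes.
def min_immersion_depth(diameter_in, element_length_in, thick_wall=False):
    multiplier = 20 if thick_wall else 15
    return multiplier * diameter_in + element_length_in

# A 1/4-inch PRT with a one-inch sensing element:
depth = min_immersion_depth(0.25, 1.0)  # 4.75 inches
```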
5. Readout Device
Just like the sensors themselves, not all readout devices for temperature calibrations are created equal. The readout is there to measure a temperature sensor’s resistance or voltage and display a temperature. The right one will depend on several different factors.
For example, what types of sensors are being calibrated? RTDs? Thermistors? Thermocouples? How many channels are needed on the readout? What is the temperature source? A thermal bath? Dry block? Furnace? Slava adds the quality of readout devices runs the gamut, with higher-end models providing better levels of accuracy.
Want to know more about how Alpha Controls can solve your calibration needs? Contact us to reach one of our trained representatives today.