Using RTDs for Measuring Temperature

Many applications in industrial automation, medical equipment, instrumentation, and elsewhere require temperature measurement for monitoring environmental conditions, correcting system drift, or achieving high precision and accuracy. Many types of temperature sensors are available, including electronic bandgap sensors, thermistors, thermocouples, and resistance temperature detectors (RTDs).

Selecting a temperature sensor depends on the temperature range to be measured and the accuracy desired, and the design of the thermometer also depends on these factors. For instance, RTDs provide an excellent means of measuring temperature over the range of -200 °C to +850 °C, combining very good stability with high measurement accuracy.

The electronics associated with using RTDs as temperature sensors with high accuracy and good stability must meet certain criteria. Because an RTD is a passive device, it produces no electrical signal on its own. The electronics must supply the RTD with an excitation current to measure its resistance: a small but steady current passes through the sensor, generating a voltage across it.
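The excitation scheme described above is just Ohm's law in practice. The sketch below illustrates it; the 1 mA excitation value and the function names are illustrative assumptions, not taken from any particular design.

```python
# Illustrative sketch of RTD resistance measurement via constant-current
# excitation. The 1 mA value is an assumption; real designs pick it to
# balance signal level against self-heating of the sensor element.

I_EXC = 1.0e-3  # 1 mA excitation current, kept small to limit self-heating

def rtd_resistance(v_measured: float, i_excitation: float = I_EXC) -> float:
    """Infer the RTD resistance from the voltage across it (Ohm's law)."""
    return v_measured / i_excitation

# A PT100 near 0 degC (about 100 ohms) driven with 1 mA develops about 100 mV:
r = rtd_resistance(0.100)
print(r)  # -> 100.0 (ohms)
```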

The design of the electronics also depends on whether it uses a 2-, 3-, or 4-wire sensor, a decision that affects the sensitivity and accuracy of the measurement. Furthermore, because the resistance of an RTD does not vary linearly with temperature, the electronics must condition the RTD signal and linearize it.
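To show why the wire count matters, here is a hedged sketch of the 3-wire compensation idea: assuming the two current-carrying leads have equal resistance (the premise of the 3-wire scheme), measuring one lead's voltage drop lets the electronics subtract the lead contribution. The function names and numeric values are assumptions for illustration.

```python
# Sketch of 3-wire lead-resistance compensation, assuming both
# current-carrying leads have equal resistance.

def rtd_3wire(v_loop: float, v_lead: float, i_exc: float) -> float:
    """v_loop: voltage across RTD plus both leads; v_lead: voltage across one lead."""
    r_loop = v_loop / i_exc       # R_rtd + 2 * R_lead
    r_lead = v_lead / i_exc       # resistance of one lead
    return r_loop - 2.0 * r_lead  # compensated RTD resistance

# Example: 1 mA excitation, 0.5 ohm per lead, RTD actually at 100 ohms:
print(round(rtd_3wire(0.101, 0.0005, 1e-3), 6))  # -> 100.0
```

A 2-wire connection would report 101 ohms here (about a 2.6 °C error on a PT100), which is why 3- and 4-wire hookups are preferred for accurate work.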

RTDs in common use are mostly made of platinum, sold commercially as PT100 and PT1000 sensors. These are available in 2-wire, 3-wire, and 4-wire configurations, and in two constructions: wire-wound and thin-film. Other RTD types are made from copper and nickel.

When using an RTD as a temperature sensor, its resistance varies as a function of temperature, and not in a linear manner. However, the variation is very precise. To linearize the output of the RTD, the electronics must apply a standardizing curve; the most common standardizing curve for RTDs is the DIN curve. This curve defines the resistance versus temperature characteristics of the RTD sensor and its tolerance within the operating temperature range.
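The DIN curve for platinum RTDs is commonly expressed as the Callendar-Van Dusen equation, with standard coefficients published in IEC 60751 / DIN EN 60751. A minimal sketch of the resistance-versus-temperature relationship, using those published coefficients:

```python
# Callendar-Van Dusen equation with the standard DIN/IEC 60751 coefficients
# for platinum RTDs:
#   t >= 0:  R(t) = R0 * (1 + A*t + B*t^2)
#   t <  0:  R(t) = R0 * (1 + A*t + B*t^2 + C*(t - 100)*t^3)
A = 3.9083e-3
B = -5.775e-7
C = -4.183e-12  # only used below 0 degC

def pt_resistance(t_c: float, r0: float = 100.0) -> float:
    """Resistance of a DIN platinum RTD at t_c (degC); r0 is the 0 degC value."""
    if t_c >= 0.0:
        return r0 * (1.0 + A * t_c + B * t_c ** 2)
    return r0 * (1.0 + A * t_c + B * t_c ** 2 + C * (t_c - 100.0) * t_c ** 3)

print(pt_resistance(0.0))                 # -> 100.0
print(round(pt_resistance(100.0), 4))     # -> 138.5055 for a PT100
```

The quadratic B term is what makes the curve fall slightly below a straight line at high temperatures, which is the nonlinearity the signal-conditioning electronics must correct.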

Using the standardizing curve helps define the accuracy of the sensor, starting with a base resistance at a specific temperature. Usually, this base resistance is 100 ohms at 0 °C. The DIN RTD standard defines several tolerance classes, which apply to all types of platinum RTDs in low-power applications.
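The two most widely quoted tolerance classes illustrate how the allowed error widens away from 0 °C. A small sketch using the commonly published Class A and Class B formulas (the function name is an assumption):

```python
# Tolerance in degC for the two most common DIN/IEC 60751 classes, as
# widely published:
#   Class A: +/-(0.15 + 0.002 * |t|)
#   Class B: +/-(0.30 + 0.005 * |t|)

def tolerance_c(t_c: float, cls: str = "B") -> float:
    """Allowed temperature error (degC) for a platinum RTD of the given class."""
    coeffs = {"A": (0.15, 0.002), "B": (0.30, 0.005)}
    base, slope = coeffs[cls]
    return base + slope * abs(t_c)

print(round(tolerance_c(100.0, "A"), 2))  # -> 0.35 degC at 100 degC
print(round(tolerance_c(100.0, "B"), 2))  # -> 0.8 degC at 100 degC
```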

The user must select the RTD and its accuracy class for the specific application. The temperature range the RTD can cover depends on the element type. The manufacturer specifies its accuracy at a calibration temperature, usually 0 °C. Any temperature measured further from this calibration point will therefore have lower accuracy and a wider tolerance.

The categorization of RTDs depends on their nominal resistance at 0 °C. Therefore, a PT100 sensor at 0 °C has a resistance of 100 ohms, while at the same temperature a PT1000 sensor has a resistance of 1000 ohms. Likewise, the temperature coefficient at 0 °C for a PT100 sensor is 0.385 ohms/°C, while that for the PT1000 is ten times higher (3.85 ohms/°C) at the same temperature.
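In practice the electronics convert a measured resistance back into temperature. For temperatures at or above 0 °C, the quadratic DIN curve can be inverted in closed form; a hedged sketch using the standard coefficients, with the same function working for PT100 and PT1000 by scaling the nominal resistance:

```python
import math

# Inverting the DIN curve R = R0*(1 + A*t + B*t^2) for t >= 0 degC,
# using the standard IEC 60751 coefficients. (Below 0 degC the quartic
# term makes a closed-form inverse impractical; iteration or a lookup
# table is typical there.)
A = 3.9083e-3
B = -5.775e-7

def pt_temperature(r: float, r0: float = 100.0) -> float:
    """Temperature (degC) of a DIN platinum RTD from resistance; valid t >= 0."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r / r0))) / (2.0 * B)

print(round(pt_temperature(100.0), 3))              # PT100 at 100 ohms  -> 0.0
print(round(pt_temperature(1000.0, r0=1000.0), 3))  # PT1000 at 1000 ohms -> 0.0
```

Because the PT1000 swings ten times more ohms per degree, it yields a larger signal for the same excitation current, which is one reason it is favored in low-power designs.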