How are HART transmitters calibrated?

As field instrumentation in process plants comes under increasingly rigorous metrological discipline, some much-needed changes are being incorporated into the latest models. Most new field instruments are now smart digital instruments. What sets the HART (Highway Addressable Remote Transducer) protocol apart is that it shares characteristics of both analog and digital control systems.

Precision analog source/measure capability and digital communication are both required to properly service these instruments. Previously, this work required two separate tools: a calibrator and a communicator. In recent models, both capabilities are available in a single HART process calibrator, allowing technicians to service a HART instrument quickly and effectively.

What is the HART protocol?

HART stands for Highway Addressable Remote Transducer.

The HART protocol uses 1,200 baud Frequency Shift Keying (FSK) based on the Bell 202 standard to superimpose digital information on the conventional 4-20 mA analog signal. The protocol is an industry standard, developed to define the communications between a control system and intelligent field devices, and it is maintained by an independent organization, the HART Communication Foundation (now part of the FieldComm Group).

With over five million HART field instruments installed in over 100,000 plants worldwide, HART is the most widely used digital communication protocol in the process industries.

HART Protocol:

  • Is compatible with traditional analog devices.
  • Supports multidrop networks, which saves cabling.
  • Is backed by all of the major vendors of process field instruments.
  • Enables smart instrument networks and reduces operating costs through improved device management.
  • Preserves present control strategies and allows conventional 4-20 mA signals to co-exist with digital communication on existing two-wire loops.
  • Provides vital insight for installation and maintenance. For instance, Tag-IDs, measured values, range and span data, product information and diagnostics.

How are HART instruments calibrated?

Calibrating an analog transmitter is fairly straightforward: run an As-Found test, then use the zero and span adjustments to set the correct relationship between the input signal and the 4-20 mA output. An As-Left test completes the calibration.


A HART instrument is more complex because it has three distinct stages. The sensor input stage sets the relationship between the input sensor and the PV, or primary variable. The PV is denominated in engineering units, for instance psi or °F. The Sensor Input stage is adjusted digitally with a Sensor Trim.

The second stage is a computational stage, which establishes the relationship between the PV (Primary Variable) and the PVAO (Primary Variable Analog Output), the digital value of the 4-20 mA output signal. The range is scaled by assigning the PV Upper Range Value (URV) and Lower Range Value (LRV).

The final stage, the Instrument Output, is set digitally with an Output Trim. These trims, and the entry of the URV and LRV, are performed with a HART configurator or communicator, while a separate calibrator provides the precision analog source and measure functions needed for accurate readings.
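The three stages can be pictured as a simple signal chain. The sketch below is illustrative only; the function names and trim factors are assumptions for the example, not a vendor's firmware:

```python
# Illustrative model of the three internal stages of a HART transmitter.
# All names and trim factors here are assumptions for the sketch; a real
# device adjusts them via its Sensor Trim and Output Trim commands.

def sensor_input_stage(raw_reading, trim_offset=0.0, trim_gain=1.0):
    """Stage 1: raw sensor reading -> PV in engineering units (e.g. psi)."""
    return (raw_reading - trim_offset) * trim_gain

def computation_stage(pv, lrv, urv):
    """Stage 2: PV -> PVAO, the digital value of the 4-20 mA output."""
    return (pv - lrv) / (urv - lrv) * 16.0 + 4.0

def output_stage(pvao, trim_offset=0.0, trim_gain=1.0):
    """Stage 3: PVAO -> physical loop current, corrected by Output Trim."""
    return pvao * trim_gain + trim_offset

# Example: with a 0-200 psi range, a PV of 100 psi should drive 12 mA.
pv = sensor_input_stage(100.0)
pvao = computation_stage(pv, lrv=0.0, urv=200.0)
print(output_stage(pvao))  # 12.0
```

Note that a trim error in any one stage propagates to the loop current, which is why each stage has its own trim.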


The calibration approach for a HART instrument depends on how the transmitter outputs are used. If only the 4-20 mA analog signal is used, the device may be treated as an analog transmitter: the correct relationship between the input sensor and the 4-20 mA analog output is set using the manual zero and span buttons on the transmitter. Alternatively, the PV LRV and PV URV can be set digitally.

In that scenario, however, the Sensor Input stage has not been adjusted. If a communicator is used to read the digital PV, it is likely to be incorrect even though the 4-20 mA output is correct.

A more rigorous approach is required if any of the digital signals will be used by the control system. If the system uses PV, Sensor Trim must be used to set the input stage correctly. The next step is to digitally assign PV LRV and PV URV and to avoid using the manual zero and span buttons to change the readings. For the final step, the Output Trim is used to correctly set the relationship between the PVAO and the 4-20 mA analog output.

Is HART calibration required?

Ever heard that the accuracy and stability of HART instruments eliminate the need for calibration, or that calibration can be accomplished by re-ranging field instruments with a HART communicator? Another common misconception is that smart instruments can be remotely calibrated by the control system. None of these statements is true.

The truth is that all instruments drift. Calibration does not mean re-ranging with a communicator alone; a precision calibrator or standard is required to do the job properly. Regular performance verification with a calibrator that is traceable to national standards is necessary because of:

  • Regulations governing consumer safety, environmental protection, and occupational safety.
  • Quality programs such as ISO 9000 standards for all instruments that have a profound impact on product quality.
  • Shifts in the performance of electronic instruments over time, caused by exposure of the electronics, and of the primary sensing element, to field environmental factors such as temperature, humidity, pollutants, and vibration.
  • Commercial requirements such as weights, measures, and custody transfer. Regular calibration is also sensible because performance checks can uncover problems not directly related to the instrumentation, such as solidified or congealed pressure lines or installation of an incorrect thermocouple type.

A calibration procedure consists of three major steps. The first one is the verification (As Found) test. Next comes the adjustment to within acceptable limits, if necessary, and the last step entails a final verification (As Left) test, if any adjustments have been made. Data from the calibration is collected and then used to compile the calibration report that documents instrument performance over time.


All instruments, including HART instruments, have to be calibrated on a regular basis as part of a preventive maintenance schedule. The calibration interval should be short enough that an instrument never drifts out of tolerance, yet long enough to avoid unnecessary calibrations. Alternatively, critical process requirements, such as calibration before each batch, can determine the interval.

Find out: HART Protocol EST4300 Smart Pressure Transmitter in our Shop 

You may also be interested in: 

Current Trim – Smart Transmitter Calibration Tutorial Part 3

Transmitter Output Current Trim (Analog Trim)

The analog output current circuitry of a 4-20 mA transmitter is quite stable, so it rarely drifts. If the analog output current is incorrect, current trim can be used to correct the output signal. For example, if the analog output current is 4.13 mA instead of the desired 4.00 mA, current trim is used to adjust the current.

But what is it used for?

Current trim is used to match the transmitter analog output current to the current input of the analog input (AI) card channel on the DCS. If the transmitter shows 0.00% but the DCS reads 0.13%, the difference is due to a disparity in current calibration. Some DCSs do not support current trim of the channels on their AI and AO cards; if there is drift in the DCS input circuitry (A/D conversion) or output circuitry (D/A conversion), current trim must then be performed in each device separately.

Current trim is only applicable to a transmitter with 4-20 mA analog output. It is valid for 4-20 mA/HART transmitters but not for FOUNDATION fieldbus (FF), PROFIBUS-PA, or WirelessHART transmitters. The discrepancy exists because pure digital transmitters have no 4-20 mA analog output.

To get an appropriate reading for current trim, the technician needs to measure the physical output current from the transmitter. The technician must therefore either perform the current trim in the field, at the process location, by connecting a multimeter to the transmitter test terminals, or bring the transmitter back to the workshop. If the former option is chosen, the technician will also need a handheld communicator.
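The underlying correction is simple two-point math. The sketch below only illustrates the arithmetic; an actual trim is carried out through the device's trim commands while measuring with a reference meter:

```python
# Illustrative two-point current trim calculation. The correction maps
# the raw output back onto the ideal 4 mA and 20 mA points.

def current_trim(measured_at_4ma, measured_at_20ma):
    """Return (gain, offset) so that corrected = gain * raw + offset."""
    gain = (20.0 - 4.0) / (measured_at_20ma - measured_at_4ma)
    offset = 4.0 - gain * measured_at_4ma
    return gain, offset

# The transmitter outputs 4.13 mA and 20.13 mA instead of 4 and 20 mA:
gain, offset = current_trim(4.13, 20.13)
print(round(gain * 4.13 + offset, 2))   # 4.0
print(round(gain * 20.13 + offset, 2))  # 20.0
```

Here the error is a pure 0.13 mA offset, so the computed gain is essentially 1; a slope error would show up as a gain different from 1.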

Trim Quick Reference

The table given below summarizes the difference between sensor trim, range setting, and current trim.

| Task | Local/Central | Example | 4-20 mA/HART | FOUNDATION Fieldbus | PROFIBUS |
|---|---|---|---|---|---|
| Sensor Trim | Local | Correct the sensor reading to the applied input. For instance, if the pressure is 0 bar but the transmitter reads 0.03 bar, sensor trim is used to adjust it to 0 bar. | YES | YES | YES |
| Range Setting | Local or Central | Set the 4 mA and 20 mA points. For instance, set the range of a pressure transmitter to output 4 mA when the input is 0 bar and 20 mA when the pressure is 40 bar. | YES | NO* | NO* |
| Current Trim | Local | Correct the analog output current. For instance, if the analog output current is 4.13 mA when it should be 4 mA, current trim is used to adjust it to 4 mA. | YES | NO | NO |

Range Values and Limit Summary

The table given below summarizes the relationship between range values and limits.

| Term | Name | Meaning |
|---|---|---|
| LSL | Lower Sensor Limit | Lowest possible value for the 4 mA point |
| LRV | Lower Range Value | The 4 mA point |
| URV | Upper Range Value | The 20 mA point |
| USL | Upper Sensor Limit | Highest possible value for the 20 mA point |
| Span | | URV minus LRV |
| Zero | | Same as LRV |
| Turndown | | USL divided by span |
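A small sketch can make the relationships in the table concrete: the range must sit inside the sensor limits, and the span must not fall below the transmitter's minimum span (all values below are example figures, not from a real device):

```python
# Check that a requested range lies within the sensor limits and report
# the resulting span. The limit values are illustrative examples.

def check_range(lrv, urv, lsl, usl, min_span):
    if not (lsl <= lrv < urv <= usl):
        raise ValueError("range must lie within sensor limits")
    span = urv - lrv
    if span < min_span:
        raise ValueError("span below the transmitter's minimum span")
    return span

# A pressure sensor with limits -1 to 40 bar and a 0.5 bar minimum span:
print(check_range(lrv=0.0, urv=40.0, lsl=-1.0, usl=40.0, min_span=0.5))  # 40.0
```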


Valve Positioner Setpoint Current Trim 

Just as with analog output current circuitry, it is rare for the setpoint input current circuitry of a 4-20 mA positioner to drift, and when it does there is usually an identifiable cause. If the input current sensing is incorrect, current trim can be used to correct the input signal. For example, if the current input reads 4.13 mA instead of the actual 4.00 mA, current trim should be used to adjust it so that the setpoint reads correctly.

Current trim is used for matching the positioner current input to the analog output current of the analog output (AO) card channel on the DCS. For example, if the DCS PID output is 0.00%, it’s possible that the positioner setpoint shows 0.13% because of the disparity in current calibration.

Current trim is applicable only to positioners with a 4-20 mA input. It is valid for 4-20 mA/HART positioners, but not for FOUNDATION fieldbus (FF) positioners, because pure digital positioners have no 4-20 mA input.

To perform an accurate current trim, the technician must connect a precision current source or measure the physical input to the positioner. The technician must therefore either perform the current trim in the field or bring the valve back to the workshop. If the former option is chosen, a handheld communicator is needed to perform the current trim in the field.

Valve Positioner Travel Stroking (Position Feedback Sensor Trim) 

Stroking is used to find a valve positioner's fully open and fully closed positions. It is an automated procedure that trims (calibrates) the position feedback sensor; in effect, it is the equivalent of a sensor trim on a pressure or temperature transmitter, except that no known reference is required. The positioner automatically strokes the valve over its full travel to discover the open and closed end positions.

The analog 4-20 mA valve position feedback current output of a 4-20 mA positioner is calibrated in the same way that a 4-20 mA transmitter is trimmed. Note that this is not required for FOUNDATION fieldbus positioners or for WirelessHART-based position feedback transmitters.

Sensor Trim Procedure 

Plants utilize a mix of transmitters to cater to different measurement needs. These devices are from different manufacturers and so deviations in measurements are expected. It’s a fact that all sensors drift, so at some point in time, all sensors need a trim to ensure accurate measurement. However, the procedure for calibration depends on the type of transmitter being used:

  • Pressure transmitter: Apply pressure from the calibrator or dead weight tester for calibration. Equalize the manifold for zero trim.
  • Temperature transmitter: Use calibrator or resistance decade box to apply milli-voltage or resistance.
  • Flowmeter: Must be calibrated against prover or master meter.
  • Valve position transmitter: Stroke the valve fully opened and fully closed.
  • pH transmitter: Test the pH sensor in buffer solutions.

Just like calibration, the procedure for sensor trim may vary slightly from one manufacturer to another, depending on the sensor technology being used. In some cases calibration is easier in the workshop. Take pH sensor buffering, for example: the pH sensor has to be placed in buffer solutions and distilled water, which is easily done in the lab. Smart pH sensors have made the process simpler still. They contain a memory chip, so a sensor can be calibrated in the lab and then carried into the field with its calibration offset and slope data stored on the chip. Once it is connected, the pH transmitter/analyzer uploads the calibration data from the sensor memory.

Sensor Zero Trim Calibration

Typical steps in a sensor zero trim calibration for a pressure transmitter are as follows (assuming a HART transmitter operated from a remote system):

  1. Instruct the technician to ask operations to put the associated control loop in manual, so that control is not upset when the PV changes during the trim.
  2. Inform the technician of the expected changes in the sensor reading.
  3. Instruct the technician to apply zero physical input (e.g., by isolating, equalizing, and venting the manifold).
  4. Instruct the technician to wait and observe as the sensor reading stabilizes and the transmitter corrects it.
  5. Confirm to the technician that the zero sensor trim was successful.
  6. Instruct the technician to ask operations to put the associated control loop back in automatic.
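The as-found/trim/as-left flow in these steps can be mimicked with a simulated device. The class and method names below are stand-ins invented for the sketch, not actual HART commands:

```python
# Simulated zero sensor trim for a pressure transmitter, mirroring the
# steps above. SimulatedTransmitter is a stand-in for a real device.

class SimulatedTransmitter:
    def __init__(self, zero_offset):
        self.zero_offset = zero_offset  # drift in the sensor reading (bar)

    def read_pv(self, applied_pressure):
        return applied_pressure + self.zero_offset

    def zero_trim(self, applied_pressure):
        # The device corrects itself, assuming the applied input is zero.
        self.zero_offset -= self.read_pv(applied_pressure)

xmtr = SimulatedTransmitter(zero_offset=0.03)
print(xmtr.read_pv(0.0))   # 0.03  (as-found: reads 0.03 bar at 0 bar)
xmtr.zero_trim(0.0)        # manifold isolated, equalized, vented: input = 0
print(xmtr.read_pv(0.0))   # 0.0   (as-left: reading trimmed back to zero)
```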


Range Setting – Smart Transmitter Calibration Tutorial Part 2

Range Setting

Range setting (re-ranging) is the process of setting the scale for the 4 mA and 20 mA points; the scale is also called the “calibrated range” or “calibration range”. It defines what input produces a 4 mA output: that input value is the Lower Range Value (LRV), also known as “zero”, meaning 0%. The input at which the output is 20 mA is the Upper Range Value (URV), also known as “full scale”, meaning 100%. People often confuse span with the URV, but they are different concepts: the span is the magnitude of the difference between the URV and the LRV. For example, when the LRV is 20 and the URV is 100, the span is 80. FOUNDATION fieldbus, PROFIBUS, and WirelessHART devices have no 4-20 mA signal, so range setting is not required for such devices in most applications.

Note that firmware in the transmitter microprocessor calculates what the output current value should be; this mathematical function is what ensures an accurate reading.

Internally, the 4-20 mA/HART transmitter computes:

Percentage = (PRIMARY_VARIABLE – LRV) / (URV – LRV) * 100 [%]
Analog Current = (PRIMARY_VARIABLE – LRV) / (URV – LRV) * 16 + 4 [mA]


Internally the 4-20 mA control system, recorder, or indicator computes:

Percentage = (Current – 4) / 16 * 100 [%]
PV = (Current – 4) / 16 * (URV – LRV) + LRV [E.U.]
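These formulas can be checked with a short script: with matching LRV and URV at both ends, the PV survives the round trip (the range values below are examples):

```python
# The transmitter-side and receiver-side formulas above, in code.

def transmitter_output_ma(pv, lrv, urv):
    """PV in engineering units -> loop current in mA."""
    return (pv - lrv) / (urv - lrv) * 16.0 + 4.0

def receiver_pv(current_ma, lrv, urv):
    """Loop current in mA -> PV reconstructed by the control system."""
    return (current_ma - 4.0) / 16.0 * (urv - lrv) + lrv

LRV, URV = 20.0, 100.0  # example range in kPa
current = transmitter_output_ma(60.0, LRV, URV)
print(current)                         # 12.0
print(receiver_pv(current, LRV, URV))  # 60.0
```

Note that the receiver can only reconstruct the PV correctly if its configured LRV and URV match the transmitter's; a mismatch silently produces a wrong reading, which is why the range must be set in both the transmitter and the controller on 4-20 mA systems.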

The analog output of a transmitter is limited to the LRV-to-URV range; hence, the analog output does not benefit from the full LSL-to-USL capability of the sensor.

Fig: The measurements derived in an analog signal system are restricted within range values.

On the other hand, FOUNDATION Fieldbus, PROFIBUS, and WirelessHART transmitters, as well as the digital output of 4-20 mA/HART transmitters, are not confined by the LRV-to-URV range; they benefit from the full LSL-to-USL capability of the sensor.

Fig: The measurements derived in a digital bus signal system utilize full sensor limits.

Transmitter range setting can be done remotely, from a central location, since it does not require any physical input. The range must be set within the Lower Sensor Limit (LSL) and Upper Sensor Limit (USL). For example, set the range of a pressure transmitter to output 4 mA when the input is 0 bar and 20 mA when the pressure is 40 bar. The span is set implicitly at the same time, though transmitters usually have a minimum span.

To get good analog output resolution and percentage accuracy, the difference between the URV and LRV must exceed the minimum span. If it does not, the results will be poor, because the quantization error is amplified too much with a small span.
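A rough illustration of the effect: the sensor's digital reading has a fixed resolution in engineering units, so its share of the span grows as the span shrinks. The sensor limits and bit count below are illustrative assumptions, not figures for a real device:

```python
# Why very small spans hurt: a fixed quantization step in engineering
# units becomes a larger fraction of the span as the span shrinks.

SENSOR_LSL, SENSOR_USL = 0.0, 40.0  # bar, example sensor limits
ADC_BITS = 16                       # assumed converter resolution
step_eu = (SENSOR_USL - SENSOR_LSL) / (2 ** ADC_BITS)  # bar per count

for span in (40.0, 4.0, 0.04):
    pct = step_eu / span * 100
    print(f"span {span:>6} bar -> quantization = {pct:.3f} % of span")
```

With the full 40 bar span the quantization is a negligible fraction of span, but at a 0.04 bar span the same step is over 1% of span, which is why transmitters enforce a minimum span.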

The sensor limits are determined mostly by the physical construction of the sensor. They cannot be changed, so they are always read-only. There is a diverse range of sensors, each with a different set of limits; take the various RTDs and thermocouples, for example. In temperature applications, a sensor type is selected whose sensor limits are sufficient to accommodate the range of the application.

Because the sensor limits are physical and cannot be changed, getting wider limits to accommodate a wider range requires purchasing a new sensor. Likewise, pressure transmitters are available with different sensor module limits, ranging from low to high pressure.

ANSI/ISA–51.1 Definition of Terms

Zero elevation: For an elevated-zero range, the amount of the measured variable zero is above the lower range-value.

Zero suppression: For a suppressed-zero range, the amount of the measured variable zero is below the lower range-value.

Range setting can only be used for transmitters with 4-20 mA analog output. It is not applicable for pure digital solutions like FOUNDATION fieldbus (FF) or WirelessHART transmitters. Since FF and WirelessHART transmitters have no 4-20 mA analog output, the need to set 4 mA and 20 mA range points doesn’t even exist. When using 4-20 mA systems, it’s necessary to set the range in both the transmitter and controller.

When using FF and PROFIBUS, the range is set in the controller, so there is no need to set a range in the transmitter. This is often confusing for beginners. There is an exception, however: FF, WirelessHART, and PROFIBUS transmitters used for differential pressure (DP) flow and level measurement, where the end-points of the DP scale (e.g. 0-250 inH2O in XD_SCALE) and the corresponding flow or level scale (e.g. 0-400 bbl/day in OUT_SCALE) are set in the transmitter. This enables DP transmitters to indicate flow or level units locally. Although FF and PROFIBUS devices provide the option of setting a range in the transmitter, it isn't always used for the application.

Fig: An analog signal system requires range, current trims, and scaling, whereas a digital bus system does not need any of these.

Nevertheless, when purchasing FOUNDATION fieldbus (FF) and WirelessHART transmitters, the nominal operating range must be specified for sizing purposes, so that the device supplier can pick the appropriate sensor model. The desired engineering unit must also be selected in the device. The DCS may still need a range set in its database as scaling end-points for bar graphs and trends, and, even though there is no range in the FOUNDATION fieldbus or WirelessHART device itself, the DCS will need a range for PID control. For control applications, level is usually expressed as a percentage of a full tank.

Both the FF transducer block and the AI function block output the value in engineering units, so for most applications it is not necessary to set a range in either block to get the PV. Many systems do, however, use the range in the FF transmitter AI block to scale faceplate bargraphs; if the resolution of the faceplate bargraph needs to be increased, a narrower range is set. When a range is set in the AI block, the percentage of range can be read from the FIELD_VAL parameter.


Fig:  Digital transmitters internally operate in engineering units

There are typically two ways to set the range of the transmitter:

  • Direct numeric value entry
  • To applied input

Direct numeric value entry

Direct numeric value entry simply means that the desired lower and upper range values are typed in, from device software or a handheld field communicator, and then sent to the transmitter. For example, a range of 20 to 100 kPa might be entered.

To applied input

Range setting to applied input requires a physical input corresponding to the desired range value to be applied to the transmitter. This is most often used in level measurement applications: since the mounting (datum) of the level transmitter plays an integral role, the range should be adjusted on site; setting it in a lab will not give accurate results. In effect, it is a zero cancellation, such as for a DP wet leg.

To perform the procedure, the tank is first emptied to its lower level and the “set PV LRV” command is sent to the transmitter, which sets the lower range value to whatever the applied input is. In the case of a DP level transmitter, if the pressure is 20 kPa when the tank is empty (the pressure tap is slightly below the datum), this becomes the new lower range value; the reading is then 0% and the analog output current is 4 mA.

At the other end, the tank is filled to its upper level and the “set PV URV” command is sent to the transmitter, setting the upper range value to the applied input. If the pressure is 100 kPa with a full tank, this becomes the new upper range value, so the reading is 100% and the analog output current is 20 mA. In between, the reading is linear. Note that the technician does not need to know what the physical input is, only that the tank is empty or full.

The set PV LRV command is commonly used to cancel the wet leg for DP transmitters in all kinds of applications, including flow. These commands are the equivalent of pushing the ‘zero’ and ‘span’ buttons found on some transmitters.
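Continuing the example above: once the empty-tank (20 kPa) and full-tank (100 kPa) inputs have been captured as LRV and URV, the level reading and loop current follow directly:

```python
# Level scaling after "range to applied input": the captured pressures
# become the 0% and 100% points, and the reading is linear in between.

lrv = 20.0   # captured by "set PV LRV" with the tank empty (kPa)
urv = 100.0  # captured by "set PV URV" with the tank full (kPa)

def level_percent(pressure_kpa):
    return (pressure_kpa - lrv) / (urv - lrv) * 100.0

def loop_current_ma(pressure_kpa):
    return level_percent(pressure_kpa) / 100.0 * 16.0 + 4.0

print(level_percent(20.0), loop_current_ma(20.0))    # 0.0 4.0
print(level_percent(60.0), loop_current_ma(60.0))    # 50.0 12.0
print(level_percent(100.0), loop_current_ma(100.0))  # 100.0 20.0
```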

ANSI/ISA–51.1 Definition of Terms

  • Range: The region between the limits within which a quantity is measured, received, or transmitted. This is expressed by stating the lower and upper range-values.

For example:

  1. 0 to 200°F
  2. –20 to +150°F
  3. 20 to 150°C
  • Range-value, lower (LRV): The lowest value of the measured variable that a device is adjusted to measure.
  • Range-value, upper (URV): The highest value of the measured variable that a device is adjusted to measure.
  • Range-limit, lower (LSL): The lowest value of the measured variable that a device can be adjusted to measure.
  • Range-limit, upper (USL): The highest value of the measured variable that a device can be adjusted to measure.
  • Span: The algebraic difference between the upper and lower range-values.

For example:

  1. Range 0 to 200°F, Span 200°F
  2. Range –20 to 150°F, Span 170°F
  3. Range 20 to 150°C, Span 130°C

Note: FOUNDATION fieldbus uses the term “scale” in place of “range”


Sensor Trim – Smart Transmitter Calibration Tutorial Part 1

Calibration can be carried out with a handheld communicator in the field, with a laptop on the bench in the workshop, or through intelligent device management (IDM) software.

Device manufacturers use the Electronic Device Description Language (EDDL) to define the format in which the system displays device information and functions to technicians. It is this technology that has made calibration of smart transmitters and other intelligent devices easier in recent times.

This tutorial explains the common principles of calibration, re-ranging, and trimming. The same principles apply to many kinds of transmitters, so it pays to understand the factors that differentiate them: the detailed procedure changes with the quantity being measured, the sensing principle, and the manufacturer.

Calibration

According to the definition, the term “calibrate” can mean several things:

  1. Set the range (scale)
  2. Trim (correct) the sensor (transducer) reading against a standard
  3. Compare the sensor (transducer) reading against a standard and record the error without correcting (trimming) it. This is often done at five points, increasing and decreasing. The error readings show whether the transmitter needs attention; if the error is consistently too high, the transmitter may be trimmed or replaced.

ANSI/ISA–51.1 Definition of Terms

Calibrate: To ascertain the output of a device corresponding to a series of values of the quantity which the device is to measure, receive, or transmit.

The data obtained can be used to:

  1. Determine the sites where scale graduations must be placed;
  2. Adjust the output and regulate it to a value within the specified tolerance;
  3. Ascertain the error by comparing the device output reading against a standard.

Calibrating Smart Transmitters

The term ‘calibration’ is often misunderstood and taken out of context, especially when talking about smart transmitters. When analog transmitters predominated, calibration meant applying a physical input and using the trim potentiometers to adjust the transmitter so that the analog output current matched the desired measurement range.

It wasn’t until the smart transmitters appeared that the “calibration” process was categorized into three parts:

  • Sensor trim
  • Range setting (re-ranging)
  • Current trim

There is a valid reason for separating these functions: with smart transmitters, no physical input needs to be applied when changing the range. This saves time and cost, and is a key reason for the rapid success of smart transmitters. It is important to understand that “sensor trim” and “range setting” are different concepts: both are part of calibration, but their definitions and functions differ. Arguably, range setting is closer to configuration than to calibration.


Sensor Trim (Digital Trim)

All sensors tend to drift over time, for a variety of reasons: extreme pressure or temperature, vibration, material fatigue, or contamination. Other factors, such as the mounting position, can also affect the sensor reading.

Sensor trim corrects the digital reading, which is shown on the device's local LCD indicator and received through digital communication. For instance, if the pressure is 0 bar but the transmitter reads 0.03 bar, sensor trim is used to adjust it back to 0 bar.

Sensor trim is also useful in optimizing performance over a smaller range than was originally trimmed in the factory.

The basic principle for calibration (sensor trim) of all transmitters follows the same pattern:

  1. Apply a known input
  2. Inform the transmitter what it is
  3. The transmitter calculates internal correction factors
  4. The transmitter uses these new factors to compute a new correct measurement reading.

For a sensor trim, the technician needs to apply a physical input to the transmitter, so the trim must be done in the field at the process location, or else the transmitter must be brought back to the workshop. This applies to 4-20 mA/HART, WirelessHART, FOUNDATION fieldbus, and PROFIBUS transmitters. The easiest way to do a sensor trim in the field is with a handheld communicator connected to the running bus, which is supported by 4-20 mA/HART, WirelessHART, and FOUNDATION fieldbus. For PROFIBUS-PA, there are two options: sending the trim command from the control system, or temporarily disconnecting the transmitter from the running bus.

Typically, there are three forms of sensor trim:

  1. Zero sensor trim
  2. Lower sensor trim
  3. Upper sensor trim

Zero trim requires the applied physical input to be zero; this form of sensor trim is often used with pressure transmitters for best accuracy. Lower and upper sensor trims perform the trim at two points, close to the lower range value and the upper range value.

To perform a sensor trim, a known physical input is applied to the transmitter and the technician enters the applied value (on a computer or handheld communicator), which allows the transmitter to correct itself. The transmitter stores the physical input values applied for the lower and upper sensor trims; these are referred to as the Lower Sensor Trim Point and Upper Sensor Trim Point, respectively.

To perform an accurate sensor trim, precise input has to be applied. This can be done with the help of the factory calibration equipment, which is usually more accurate than the portable calibrators at site. Since the transmitters in current use are typically very stable, sensor trim of brand new transmitters is rarely done at the time of commissioning.

It’s important to note that sensor trim is not done in the sensor itself but in the firmware of the transmitter microprocessor. The trim is a mathematical function that adjusts numerical bias and gain factors. It is the sensor reading after the A/D conversion that is trimmed, not the sensor hardware.
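The bias-and-gain adjustment can be illustrated with simple arithmetic: two trim points are enough to solve for both correction factors. The numbers below are made up for the example:

```python
# Two-point sensor trim as arithmetic: solve for gain and bias so that
# the corrected reading matches the applied input at both trim points.

def solve_trim(applied_low, read_low, applied_high, read_high):
    """Return (gain, bias) so that applied = gain * read + bias."""
    gain = (applied_high - applied_low) / (read_high - read_low)
    bias = applied_low - gain * read_low
    return gain, bias

# Transmitter reads 0.03 bar at 0 bar and 10.05 bar at 10 bar applied:
gain, bias = solve_trim(0.0, 0.03, 10.0, 10.05)
corrected = gain * 10.05 + bias
print(round(corrected, 6))  # 10.0
```

A zero trim is the special case where only the bias is corrected, with the applied input known to be zero.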

Sensor trim is the aspect of calibration this article focuses on, and it applies across measurement types:

  • Level calibration
  • Flow calibration
  • Pressure calibration
  • Temperature calibration, etc.

Sensor Trim Points

The purpose of the (CAL_POINT) parameters is to record the points where sensor trim was last performed and, if required, to serve as the starting point for doing it again. If the sensor trim point parameters read 0 and 360 mbar, those are the points at which the transmitter was last trimmed.

Since the transmitter extrapolates beyond these points, it might be less accurate over a wider range such as -600 to +600 mbar. This is not uncommon, and the resulting readings are often acceptable. Greater accuracy over that range can be achieved by performing sensor trim at -600 and +600 mbar instead.

It’s important to note that sensor trim points are NOT range configuration parameters, so they cannot be ‘set’ directly. These parameters are written when a sensor trim is performed; the transmitter then remembers the points at which the trim was made. Usually there is a sensor trim wizard (“method”) that guides the technician through the calibration step by step, and it is this wizard that writes the sensor trim point parameters.

Find out: HART Protocol EST4300 Smart Pressure Transmitter in our Shop 

Some Important Transmitter Calibration Terms You Need to Know

The term ‘calibration’ is often misunderstood and taken out of context, especially when talking about smart transmitters. To fully understand the concepts that follow, it’s essential to comprehend some important terms used in transmitter calibration.

People most commonly struggle to differentiate sensor trim, range setting, and current trim. The explanations here will help eliminate that confusion and enable you to grasp the concepts.

Thanks to technological advancement, plants now contain a diverse range of devices from various manufacturers, all of which must be calibrated.

There was a time when these transmitters required technicians to perform manual tasks, such as stepping into the field to change the range. Sometimes technicians had to work in pairs, with one worker in the field performing the sensor trim and the other in the control room monitoring the software.

Times have changed, and transmitter calibration can now easily be performed with a handheld field communicator or intelligent device management software based on EDDL (Electronic Device Description Language). The choice of tool depends largely on the requirements of the task.

With EDDL technology and its features, transmitter calibration is now easier than ever. User guidance such as wizards, together with the in-depth knowledge provided by the device manufacturer’s experts, ensures that users face few issues during the process. This wealth of information results in lower maintenance costs and better-performing devices.

When easy field work is the goal, EDDL plays its part: it supports small portable field communicators and delivers reliable results.

What is Calibration?

Calibration can be broken down into three different concepts: sensor trim, range setting (re-ranging), and current trim.

[Figure: Transmitter calibration — Eastsensor Technology]

These three terms are explained individually below to aid comprehension, and you’ll also find them in the calibration tutorial.

What is sensor trim?

Sensor trim is the correction of the digital reading from the sensor after the A/D conversion.

What is transmitter re-ranging?

Transmitter re-ranging refers to configuring the lower and upper range values, i.e. the input values at which the transmitter output is 4 mA and 20 mA respectively.
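As a sketch, the mapping from a configured range to loop current can be written as simple linear math (assuming a linear transfer function; the function and parameter names are illustrative, not any device’s API):

```python
# Sketch of how a configured range maps a measured input to loop current,
# assuming a linear transfer function. LRV/URV follow common HART usage
# for lower and upper range values.

def output_ma(measured, lrv, urv):
    """Return the 4-20 mA output for a measured input given the
    configured lower (LRV) and upper (URV) range values."""
    span = urv - lrv
    return 4.0 + 16.0 * (measured - lrv) / span

# Range of 0 to 200 kPa: mid-scale input gives mid-scale current
print(output_ma(100.0, 0.0, 200.0))  # 12.0
```

Re-ranging changes only `lrv` and `urv` in this picture; it does not touch the sensor correction or the analog output stage.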

What is current trim?

Current trim is the correction of the analog output from the transmitter.
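A two-point current trim can be sketched as follows, assuming a linear D/A response. The measured values and function names here are hypothetical, chosen only to show the idea: a reference meter measures the actual loop current at the nominal 4 mA and 20 mA points, and the transmitter adjusts what it commands so the measured output lands on the nominal value.

```python
# Minimal sketch of two-point current trim math (illustrative only,
# assuming the D/A stage is linear: actual = slope * commanded + offset).

def fit_dac(measured_at_4, measured_at_20):
    """Fit the D/A response from reference-meter readings taken at the
    nominal 4 mA and 20 mA output points."""
    slope = (measured_at_20 - measured_at_4) / 16.0
    offset = measured_at_4 - slope * 4.0
    return slope, offset

def trimmed_command(desired_ma, slope, offset):
    """Commanded value that makes the measured loop current equal the
    desired output."""
    return (desired_ma - offset) / slope

# Example: nominal 4 mA measured as 3.98 mA, nominal 20 mA as 20.06 mA
slope, offset = fit_dac(3.98, 20.06)
cmd = trimmed_command(4.0, slope, offset)
print(round(slope * cmd + offset, 6))  # 4.0 after trim
```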

How do I calibrate a smart transmitter?

To calibrate a smart transmitter, you need a handheld field communicator, a laptop with an interface and software, or a computer. With it, you can control sensor trim, range setting (re-ranging), and current trim for a smart pressure, temperature, level, flow, or other measurement transmitter. A physical input is applied for sensor trim but not for range setting; for current trim, the output current must be measured.

How do I set the range in a smart transmitter?

To set the range of a smart transmitter, you need a handheld field communicator, a laptop with an interface and software, or a computer. It controls range setting (re-ranging) for a smart pressure, temperature, level, flow, or other measurement transmitter. In most devices the range values are entered directly as numeric values; some devices also allow the range to be set from an applied input, which is useful in a handful of applications.

Do I need to calibrate a fieldbus transmitter?

All sensors tend to drift over time, regardless of the transmitter’s output signal, so sensor trim will periodically be required for 4-20 mA/HART, FOUNDATION fieldbus, and PROFIBUS-PA devices alike. However, for FOUNDATION fieldbus and PROFIBUS-PA devices you don’t need to set a range in most applications, and since there is no 4-20 mA output they need no current trim.

What is turndown?

Turndown is the ratio of the Upper Sensor Limit to the smallest permitted span. For instance, if the Upper Sensor Limit is 80 kPa and the minimum span is 2 kPa, then the turndown ratio (rangeability) of that transmitter is 40:1.

What is rangeability?

Rangeability is another name for turndown: the ratio of the Upper Sensor Limit to the smallest permitted span. For instance, if the Upper Sensor Limit is 80 kPa and the minimum span is 2 kPa, then the turndown ratio (rangeability) of that transmitter is 40:1.
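The turndown arithmetic from the example above reduces to a one-line helper (illustrative only):

```python
# Turndown (rangeability) as described above: the ratio of the
# Upper Sensor Limit to the smallest permitted span.

def turndown(upper_sensor_limit, minimum_span):
    """Return the turndown ratio, e.g. 40.0 meaning 40:1."""
    return upper_sensor_limit / minimum_span

print(turndown(80.0, 2.0))  # 40.0, i.e. a 40:1 turndown
```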

What is zero elevation?

Zero elevation refers to a lower range value (4 mA point) that is below zero. For instance, ranges of -25 to +100, -100 to 0, or -100 to -20 are classified as zero elevation.

What is zero suppression?

Zero suppression is a configuration where the lower range value (4 mA point) is above zero. A range of 20 to 100 is categorized as zero suppression.
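The zero elevation and zero suppression definitions above reduce to a simple check on the lower range value (a sketch; the function name is illustrative):

```python
# Classify a configured range per the definitions above: the lower range
# value (4 mA point) below zero is elevation, above zero is suppression.

def classify_range(lrv, urv):
    """Return the zero elevation / suppression classification of a range."""
    if lrv < 0:
        return "zero elevation"
    if lrv > 0:
        return "zero suppression"
    return "zero-based"

print(classify_range(-25.0, 100.0))  # zero elevation
print(classify_range(20.0, 100.0))   # zero suppression
```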

What is a documenting calibrator?

A documenting calibrator combines the functions of a portable calibrator and a handheld field communicator in a single tool. It can digitally communicate with intelligent transmitters and automatically document the calibration performed.

Why are analog signal endpoints 4-20 mA and not other values?

The 4 mA and 20 mA endpoints evolved from the earlier 3-15 psi pneumatic standard. The live zero at 4 mA allows a genuine zero reading to be distinguished from a dead loop, and the selected range supports linearity.

What is paperless calibration?

Paperless calibration refers to using a documenting calibrator that automatically records the calibration performed, eliminating the need for manual documentation.

What is a calibration route?

A calibration route is the order in which instruments are calibrated. It generally covers the pieces of equipment in a specific area of the plant. To make it more efficient, the calibration route can be ordered by calibration due date or by instrument type.

How do I ‘zero’ a transmitter?

To ‘zero’ a transmitter, use either a zero sensor trim or set the lower range value (re-range).

What is NAMUR NE43?

NE43 is a recommendation from the NAMUR organization proposing standard analog signal levels for failure indication from transmitters with 4-20 mA output. If the current is below 3.6 mA or above 21 mA, the system interprets it as a sensor fault.
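The NE43 failure bands can be expressed as a simple check. The 3.6 mA and 21 mA failure limits come from the recommendation as described above; the 3.8 mA and 20.5 mA saturation limits are the values commonly quoted from NE43 and are an added assumption here:

```python
# Classify a loop current against NAMUR NE43 signal bands.
# Failure limits (<3.6 mA, >21 mA) are from the text above; the
# 3.8/20.5 mA saturation limits are commonly quoted NE43 values
# (an assumption in this sketch).

def ne43_status(current_ma):
    """Return 'failure', 'saturation', or 'normal' for a loop current."""
    if current_ma < 3.6 or current_ma > 21.0:
        return "failure"
    if current_ma < 3.8 or current_ma > 20.5:
        return "saturation"
    return "normal"

print(ne43_status(12.0))  # normal
print(ne43_status(3.5))   # failure
print(ne43_status(20.8))  # saturation
```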
