What is pressure sensor offset?

Imagine you have a pressure sensor connected to a gauge that is supposed to read 0 when there is no pressure. But even when nothing is connected, the gauge shows +1 psi.

This is called a pressure sensor offset error. It means the sensor is not perfectly calibrated and shows a small pressure reading when there should be none.

To give some real numbers, let’s look at specifications:

  • A low-cost sensor may have an offset of ±1% of the full-scale range.
  • So a sensor that reads up to 100 psi could show an offset anywhere between -1 psi and +1 psi.
  • A high-precision sensor reduces this to only ±0.1% FS or ±0.1 psi for the same 100 psi sensor.
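
As a quick check of what such a spec means in practice, a ± percent-of-full-scale figure can be converted into a psi band directly. Below is a minimal sketch in Python; the 100 psi range and the two spec percentages are just the example figures above, not values from any particular datasheet:

    def offset_band_psi(full_scale_psi, offset_pct_fs):
        """Convert an offset spec in +/- percent of full scale into +/- psi."""
        return full_scale_psi * offset_pct_fs / 100.0

    # Example figures from the text, for a 100 psi full-scale sensor
    print(offset_band_psi(100, 1.0))   # low-cost sensor:  +/-1.0 psi
    print(offset_band_psi(100, 0.1))   # high-precision:   +/-0.1 psi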

Offsets can occur due to small differences in the sensor’s physical structure or electronic components. Engineers try to minimize them through calibration testing.

Let’s take an example.

Say a sensor has an offset of +1 psi.

Then:

If the actual pressure is 0 psi, the sensor will read 1 psi instead of the correct 0 psi.

This is a full 1 psi error just because of the offset.

If the actual pressure is 5 psi, the sensor will read 6 psi instead of 5 psi. The error is still the same 1 psi offset.

As pressure increases, the impact of a fixed offset error becomes smaller as a percentage of the reading, but the absolute error stays the same.

For small pressure changes near zero, the offset could be a major source of inaccuracy, swamping the actual pressure variation.
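
This pattern is easy to tabulate. The sketch below uses the same fixed +1 psi offset as the example above (illustrative values only) and shows the absolute error staying constant while the percentage error shrinks at higher pressures and balloons near zero:

    offset_psi = 1.0  # fixed zero offset, the illustrative value used above

    for actual_psi in [0.5, 1, 5, 20, 100]:
        reading = actual_psi + offset_psi           # what the sensor reports
        pct_error = 100 * offset_psi / actual_psi   # error relative to true pressure
        print(f"actual {actual_psi:6.1f} psi -> reads {reading:6.1f} psi "
              f"({pct_error:6.1f}% error)")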

The greater the offset, the larger these errors will be. This is why industries like aerospace that require very high accuracy specify sensors with offsets smaller than ±0.1% of full scale.

What is the relationship between offset and sensitivity?

Pressure sensors are designed to respond to very small pressure changes. But they aren’t always perfect: some have a small offset error, reading a non-zero pressure when there is actually none.

Let’s look at two examples:

A low-cost sensor has a sensitivity of 1 psi, meaning it can detect changes of about 1 psi. Its offset is ±2 psi.

A high-precision sensor has a finer sensitivity of 0.1 psi, and its offset is smaller, only ±0.5 psi.

In this comparison, the high-precision sensor achieves both finer sensitivity and a lower offset error, but the two properties are actually in tension.

This is because making a sensor more sensitive, say by having a thinner diaphragm, also makes it more susceptible to unwanted stresses that can cause offsets.

So reducing a sensor’s offset requires design optimizations that may lower its maximum sensitivity as a trade-off.
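
To make the trade-off concrete, the short sketch below compares the two hypothetical sensors from this section and checks whether a small 1 psi change near zero pressure even rises above each one’s offset band (the numbers are the illustrative figures above, not datasheet values):

    sensors = {
        "low-cost":       {"resolution_psi": 1.0, "offset_psi": 2.0},
        "high-precision": {"resolution_psi": 0.1, "offset_psi": 0.5},
    }

    pressure_step_psi = 1.0  # a small change near zero pressure

    for name, spec in sensors.items():
        resolvable = pressure_step_psi >= spec["resolution_psi"]
        buried_in_offset = pressure_step_psi <= spec["offset_psi"]
        print(f"{name}: resolvable={resolvable}, within offset band={buried_in_offset}")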

How to reduce offset error?

Improved diaphragm flatness

Improving diaphragm flatness is very effective. If the diaphragm bows up in the middle by even 1 micron, it causes an offset. Polishing with precision machines accurate to 0.01 microns can reduce offsets from 1 psi to 0.1 psi.

A designer can follow the steps below to achieve good flatness:

    1. Careful single-crystal silicon wafer selection ensures as few defects and variations as possible.
    2. Advanced chemical mechanical polishing (CMP) processes precisely smooth surfaces to optimize flatness.
    3. Thinning diaphragms from the back side using etching preserves the original stress-free, flat front surface.
    4. Diaphragm thickness is tightly controlled via etch stop points to within 1-2 μm over the entire wafer.
    5. Smaller-diameter diaphragms also see proportional reductions in potential profile irregularities.
    6. Flatness can be tested pre-assembly using optical or capacitive profilometry to sub-angstrom accuracy (a simple flatness check is sketched below).
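
The pre-assembly flatness test in the last step boils down to a simple pass/fail check on the measured profile. Here is a minimal sketch, assuming the profilometer delivers a list of height samples in microns; the 1 micron bow limit is the figure quoted above, and the sample data is hypothetical:

    def diaphragm_bow_um(profile_um):
        """Peak-to-valley deviation of a measured height profile, in microns."""
        return max(profile_um) - min(profile_um)

    def passes_flatness(profile_um, limit_um=1.0):
        """True if the diaphragm bow stays within the allowed limit."""
        return diaphragm_bow_um(profile_um) <= limit_um

    # Hypothetical profilometry samples across one diaphragm (microns)
    profile = [0.00, 0.12, 0.31, 0.45, 0.38, 0.20, 0.05]
    print(diaphragm_bow_um(profile), passes_flatness(profile))  # 0.45 True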

Thermal stabilization

Adding a small thermistor provides temperature data to a processor, which then adjusts the sensor output up or down, for example by 0.1 psi between 20°C and 30°C, cutting thermal drift offsets roughly in half.

All parts in a sensor expand and contract slightly with temperature changes. This thermal effect alone can cause up to 2 psi of offset error over a 100°C range in many sensors!

To prevent this, engineers add a tiny thermistor next to the pressure-sensing element. This thermistor acts as a built-in temperature monitor.

As temperature rises, the thermistor’s electrical resistance changes in a precise and known way based on its material properties.

An on-board microchip continuously monitors the thermistor and compares its reading to the chip’s internal temperature-resistance data.

If it detects a 1°C rise, it knows the sensor housing and components will expand just enough to induce 0.1 psi of offset error. So it automatically subtracts 0.1 psi from readings.

This digital thermal compensation holds the offset steady within ±0.05 psi over the full operating range. Without it, drifting offsets would ruin measurement accuracy.
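
A simplified view of this compensation logic is sketched below. It is not any vendor’s firmware: the 25°C reference point is an assumption, and the 0.1 psi per °C coefficient is simply the illustrative figure from this example:

    REFERENCE_TEMP_C = 25.0   # temperature at which the zero was trimmed (assumed)
    OFFSET_PSI_PER_C = 0.1    # thermal offset coefficient from the example above

    def compensate(raw_psi, thermistor_temp_c):
        """Subtract the predicted thermal offset from a raw pressure reading."""
        predicted_offset = OFFSET_PSI_PER_C * (thermistor_temp_c - REFERENCE_TEMP_C)
        return raw_psi - predicted_offset

    # A 1 degree C rise above the reference removes 0.1 psi from the reading
    print(f"{compensate(10.1, 26.0):.2f} psi")  # -> 10.00 psi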

By intelligently accounting for temperature, sensors can self-correct for a major source of potential errors. This ensures reliable pressure readings regardless of ambient conditions.

Improved vacuum sealing and die attachment

Earlier designs held a 0.5 psi reference vacuum with a single rubber seal, which let the vacuum degrade and caused offsets of up to 4 psi over time. Switching to a dual O-ring Viton seal maintained the vacuum for over a year, lowering offsets to a stable 0.2 psi.

Engineers also lowered die-attach stress by changing from a heavy-duty epoxy to a very thin cyanoacrylate adhesive. This reduced residual mounting strain, cutting the offset from 1 psi down to 0.3 psi.

Digital calibration

Traditional sensors use trim potentiometers that may adjust offset by 0-5% of the sensor’s full scale range. This is like correcting a reading by up to 5 psi on a 100 psi sensor.

Modern sensors add an on-board digital chip that converts the analog output to a 12-bit number (0 to 4095 counts). Factory calibration determines that the reading should be 1010 counts at zero pressure.

If uncalibrated, the sensor delivers 1020 counts instead. The chip calculates that it needs to subtract 10 counts from its results to trim the offset to the proper 1010.

It does this digitally by programming internal resistor settings to lower the bridge output by exactly 10 counts. The zero-pressure reading now sits at the desired 1010 instead of 1020, removing an error of roughly 0.25% of full scale (10 out of 4095 counts).
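
In code terms, this zero-point trim is just a stored correction applied to every conversion. The sketch below uses the example counts from the text; real parts implement it in on-chip logic or firmware rather than Python:

    FULL_SCALE_COUNTS = 4095    # 12-bit ADC output range
    TARGET_ZERO_COUNTS = 1010   # factory-determined reading at zero pressure

    measured_zero_counts = 1020                            # this unit, uncalibrated
    zero_trim = measured_zero_counts - TARGET_ZERO_COUNTS  # +10 counts to remove

    def corrected_counts(raw_counts):
        """Apply the stored zero trim to a raw ADC reading."""
        return raw_counts - zero_trim

    granularity_pct_fs = 100 / FULL_SCALE_COUNTS  # one count is about 0.024% of full scale
    print(corrected_counts(1020), f"{granularity_pct_fs:.3f}% FS")  # 1010 0.024% FS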

Traditional sensors lacked this level of digital precision. By trimming offsets with a granularity of about 0.025% of full scale (a single count) rather than the 5% of an analog trim, much cleaner zero-pressure readings result, even over time and temperature changes.

A few takeaway points

  1. Response time refers to how quickly a sensor detects and reports pressure changes. It’s an important specification that depends on sensor design and materials.
  2. Applications with rapidly fluctuating pressures, like engines, require very fast sub-millisecond response times. Slower processes can tolerate response times of 10-100 ms.
  3. Higher sensitivity generally means slower response times due to design trade-offs. The right balance is needed for each application.
  4. Offset errors occur when a sensor reads a non-zero pressure with zero pressure applied. This directly impacts measurement accuracy.
  5. Improvements in diaphragm flatness, thermal stabilization, and calibration precision are used to minimize offset errors.
  6. Digital calibration and compensation techniques allow offsets to be reduced to levels far below what analog designs provide.
  7. Understanding response time and error specifications is critical for selecting a pressure sensor that can meet the measurement needs and tolerances of different industrial processes.
  8. With optimized designs and materials, sensor engineers can enhance performance to enable highly precise real-time dynamic pressure monitoring.