Recalibrating a Gauge Pressure Transmitter to Measure Vacuum Even Though It's Not Categorised as an Absolute Type?

Thread Starter

Justin

What is the difference between gauge pressure and absolute pressure? Can a gauge pressure transmitter be used for absolute pressure? If so, how can the two categories be discriminated from each other? I have an ABB Model 261GS gauge pressure transmitter. I need to recalibrate this transmitter from -0.1 to 0.4 bar.
 
All pressure measurements are inherently differential; every pressure measurement is compared to a reference pressure.

Absolute pressure references absolute zero (a hard vacuum).

Gauge pressure references the local atmospheric pressure.

Differential pressure references whatever pressure its reference port is connected to.

Vacuum can be measured in absolute units referencing absolute zero, or in gauge units referenced to atmospheric pressure.

It is imperative that the letter A be used to indicate absolute pressure units, as in PSIA or barA.

The use of the letter G for gauge is optional, but whenever I deal with vacuum, I add the letter G to the pressure units so there is no question whether the vacuum is gauge or absolute.

>Can a gauge pressure transmitter be used for absolute pressure?

No. The reference points are vastly different. One can assume or measure the barometric (atmospheric) pressure with an absolute measurement and add that value to the gauge pressure measurement, but that's a kludge.
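
To put numbers on that relationship, here's a minimal sketch (just illustrative Python, not anything in the transmitter; it assumes the standard-atmosphere value of 1.01325 bar for the local barometric pressure, and the function names are made up):

def gauge_to_absolute(p_gauge_bar, p_atm_bar=1.01325):
    """Absolute = gauge + local barometric (absolute) pressure."""
    return p_gauge_bar + p_atm_bar

def absolute_to_gauge(p_abs_bar, p_atm_bar=1.01325):
    """Gauge = absolute - local barometric (absolute) pressure."""
    return p_abs_bar - p_atm_bar

# A vacuum of -0.1 barg at standard atmosphere:
print(gauge_to_absolute(-0.1))   # ~0.913 barA
# A hard vacuum (0 barA) expressed in gauge units:
print(absolute_to_gauge(0.0))    # ~-1.013 barg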

>If so, how can the two categories be discriminated from each other?

Both gauge and absolute transmitters have only a single port, but the model number and (usually) the nameplate indicate the measurement units.

>I have an ABB Model 261GS gauge pressure transmitter. I need to recalibrate this transmitter from -0.1 to 0.4 bar.

The G in the model number is for gauge; if it were an A, it would be an absolute transmitter.

Most modern gauge pressure transmitters can be ranged for vacuum. If your units are gauge units (and they probably are), then -0.1 bar is vacuum and 0.4 bar is positive pressure.

Hook up a communicator and do it.
 
Asok Kumar Hait

For a gauge pressure transmitter, the reference (- side) is open to the atmosphere. If you connect the + side port of the transmitter to the vessel/pipe, then you are measuring the pressure in that vessel/pipe against the atmospheric pressure (roughly 1 bar) acting on the - side of the transmitter, and that's why this is gauge pressure.
For an absolute pressure transmitter, this reference side is sealed at near-zero pressure (a vacuum).

You can easily calibrate a gauge pressure transmitter to -0.1 barg to 0.4 barg. Just shift the zero so that -0.1 barg gives 4 mA and +0.4 barg gives 20 mA.
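
As a rough sketch of that mapping (assuming the usual linear 4-20 mA transfer; the function name is just for illustration):

def pressure_to_ma(p_barg, lrv=-0.1, urv=0.4):
    """Linear 4-20 mA output for a transmitter ranged lrv..urv (barg)."""
    return 4.0 + 16.0 * (p_barg - lrv) / (urv - lrv)

print(pressure_to_ma(-0.1))  # 4.0 mA at the lower range value
print(pressure_to_ma(0.0))   # 7.2 mA at atmospheric pressure (0 barg)
print(pressure_to_ma(0.4))   # 20.0 mA at the upper range value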

But you can't calibrate this transmitter for -0.1 bar(A) to +0.4 bar(A).

Hope this helps
 
David,

In the manufacturer's specs for the ABB 261GS, it's mentioned that the minimum span for the instrument is 1.5 bar. My range, being -0.4 bar to 0.1 bar, has a span of 0.5 bar? I know the transmitter would give a 4 to 20 mA if I re-range it using a HART communicator, but would it do a good job?

And you had mentioned "Absolute pressure references absolute zero (a hard vacuum)"
In terms of bar, what is the value of this absolute zero / hard vacuum with respect to atmospheric pressure?

Suppose a transmitter tag said it's calibrated from 0 BARA to 1 BARA. While using a normal pressure pump (Druck/Omega) for calibration, I would get a 4~20 mA from the transmitter by applying -1 BAR to 0 BAR (approx.) using the pressure calibration pump.

I am just trying to draw a conclusive relation between your definition and my observation. Is this pressure difference anomaly occurring because the pump (Druck) sensor was calibrated based on gauge/atmospheric pressure?

And one last question, you had mentioned,
"Most modern gauge pressure transmitters can be ranged for vacuum. If your units are gauge units (and they probably are) then -0.1 bar is vacuum, 0.4 bar is positive pressure."

What if the units had been absolute and not gauge?

Thanks a lot.
 
Your stated range has changed from -0.1 to 0.4 bar (in the first post) to -0.4 to 0.1 bar. Make sure you know which range you're dealing with.

>In the manufacturer's specs for the ABB 261GS, it's mentioned that the minimum span for the instrument is 1.5 bar. My range, being -0.4 bar to 0.1 bar, has a span of 0.5 bar?

You are correct; your -0.4 to 0.1 (or -0.1 to 0.4) range is a span of 0.5 bar.

>the min span for the instrument is 1.5 bar.

If the transmitter has a minimum span spec, it has a minimum span spec. But what does that mean? I'm not sure whether the transmitter will balk at accepting a range less than its minimum or whether ABB will just claim that a 'short span' won't meet its accuracy spec.

Just extend the range to -0.4 to 1.1 bar to get a 1.5 bar span and use 33% (0.5/1.5) of the available current signal, from 4.0 mA to 9.33 mA, at the HMI/PLC/DCS/whatever.
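
A minimal sketch of the rescaling the HMI/PLC would then do, assuming the transmitter is ranged -0.4 to 1.1 barg on a standard linear 4-20 mA loop (the function name is my own):

def ma_to_pressure(i_ma, lrv=-0.4, urv=1.1):
    """Convert loop current back to barg for a transmitter ranged lrv..urv."""
    return lrv + (urv - lrv) * (i_ma - 4.0) / 16.0

# Only the bottom third of the signal is actually used in this service:
print(ma_to_pressure(4.0))    # -0.4 barg, bottom of the range of interest
print(ma_to_pressure(9.33))   # ~0.1 barg, top of the range of interest
print(ma_to_pressure(20.0))   # 1.1 barg, never reached here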

>I know the transmitter would give a 4 to 20 mA if I re-range it using a HART communicator, but would it do a good job?

The transmitter always has a 4-20 mA output; the question is, what does the 4-20 mA represent?

Will it do a good job? It depends. You already have the transmitter: connect it, configure it, use it and see. If the resolution is not sufficient, you'll need to get a different model.
A bird in the hand is worth two in the bush.

>And you had mentioned "Absolute pressure references absolute zero (a hard vacuum)"

>In terms of bar, what is the value of this absolute zero / hard vacuum with respect to atmospheric pressure?

It depends on the barometric pressure at the moment. What the weatherman reports as the barometric pressure is the pressure above absolute zero exerted by the weight of the atmosphere; barometric pressure is an absolute pressure value. Nominally, absolute zero (0 barA) sits about 1.013 bar below standard sea-level atmospheric pressure.

>Suppose a transmitter tag said it's calibrated from 0 BARA to 1 BARA. While using a normal pressure pump (Druck/Omega) for calibration, I would get a 4~20 mA from the transmitter by applying -1 BAR to 0 BAR (approx.) using the pressure calibration pump.

Correct, assuming you mean "applying -1 BARg [same as 0 BARA] to 0 BARg [close to 1 BARA]", but I doubt your pump will get to -1 BARg [0 BARA] because it's tough to pull a hard vacuum.
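
As a quick sanity check on those numbers, here's a short sketch of my own (it assumes the pump's reference is the local atmosphere at roughly 1.013 bar and a linear 4-20 mA output on the 0-1 BARA transmitter):

P_ATM = 1.01325  # assumed local barometric pressure, bar absolute

for p_abs in (0.0, 0.5, 1.0):                # calibration points, barA
    p_gauge = p_abs - P_ATM                  # what a gauge-referenced pump indicates
    i_ma = 4.0 + 16.0 * (p_abs - 0.0) / 1.0  # transmitter ranged 0 to 1 barA
    print(f"{p_abs:.1f} barA = {p_gauge:+.3f} barg -> {i_ma:.1f} mA")

# 0.0 barA = -1.013 barg ->  4.0 mA  (a hard vacuum; tough with a hand pump)
# 0.5 barA = -0.513 barg -> 12.0 mA
# 1.0 barA = -0.013 barg -> 20.0 mA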

>I am just trying to draw a conclusive relation between your definition and my observation. Is this pressure difference anomaly occurring because the pump (Druck) sensor was calibrated based on gauge/atmospheric pressure?

I'm not sure what 'pressure difference anomaly' you are referring to.

>And one last question, you had mentioned,
"Most modern gauge pressure transmitters can be ranged for vacuum. If your units are gauge units (and they probably are) then -0.1 bar is vacuum, 0.4 bar is positive pressure."
>What if the units had been absolute and not gauge?

If you can achieve -0.1 BARA, you'd probably win the Nobel Prize in Physics, because that's less than absolute zero.
 
David,

That's a very good explanation of the difference between absolute and gauge.

We use Absolute pressure transmitters on evaporators.

Roy
 