Transmitter calibration
Engineering and workplace issues topic
Posted by Anonymous on 3 February, 2007 - 12:46 pm
Hi all!!!!

Please, does anyone know a procedure to calibrate a 4-20 mA, 24 V transmitter? Thanks from Argentina.


Posted by David on 3 February, 2007 - 9:17 pm
Modern industrial transmitters are calibrated to a traceable national standard at the factory to an accuracy that is better than 99 out of 100 shops' ability to duplicate with field calibrators. So you shouldn't need to calibrate a new transmitter.

Re-ranging what the 4-20 mA output represents is not 'calibration', it is 'configuration'. Many transmitters are 'smart' transmitters that can be re-ranged in the field. Dumb transmitters cannot be re-ranged, but they can be tweaked slightly, which really is a calibration procedure or drift adjustment.

Is your transmitter smart or dumb?

Some smart transmitters, like the Siemens DS3 series, have built-in pushbuttons and a digital display which are used to range the transmitter (with or without a 'source'). Does yours?

Or, HART is a digital protocol designed for configuring transmitters.

Most 'smart' transmitters are HART compatible which requires the use of either a handheld HART communicator or Windows software and a HART modem to access the configuration parameters. Configuring a transmitter with HART requires that the transmitter be powered up, and does not require a source.

'Dumb' transmitters require some source (pressure for a pressure transmitter, a signal source for a temperature transmitter) to properly tweak the pot adjustments. Does yours have pots with screwdriver adjustments?

David


Posted by kumar on 2 March, 2007 - 11:13 pm
Dear all,

I have a Rosemount smart transmitter. I want to know the procedure for changing the upper range value without using HART.

kumar


Posted by Walt Boyes on 4 March, 2007 - 1:17 pm
I don't think you can. I think you need either HART configuration software (available from several manufacturers) or a HART communicator. I went and looked at a Rosemount pressure transmitter manual just to be sure, and you CAN adjust the zero, but not, apparently, the upper range value.

Walt Boyes
Editor in Chief
Control magazine
www.controlglobal.com
blog:Sound OFF!! http://waltboyes.livejournal.com
_________________

Putman Media Inc.
555 W. Pierce Rd. Suite 301
Itasca, IL 60143
630-467-1301 x368
wboyes@putman.net


Posted by Juan Pinzon on 6 March, 2007 - 10:09 pm
Dear Kumar,

This is from a Rosemount 3051 manual.

Rerange with a pressure input source and the local zero and span buttons (without a HART Communicator):

1. Loosen the screw holding the certification label on top of the transmitter housing, and rotate the label to expose the zero and span buttons.

2. Using a pressure source with an accuracy three to ten times the desired calibrated accuracy, apply a pressure equivalent to the lower range value to the high side of the transmitter.

3. To set the 4 mA point, press and hold the zero button for at least two seconds, then verify that the output is 4 mA.

*(Very important: this is not a zero trim; for a zero trim you will need to use a HART Communicator.)*

Then,

4. Apply a pressure equivalent to the upper range value to the high side of the transmitter.

5. To set the 20 mA point, press and hold the span button for at least two seconds, then verify that the output is 20 mA.

The complete manual and further information are available on the Rosemount website.
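As a sanity check after reranging, the expected output is just a linear map from the configured range to 4-20 mA. A minimal Python sketch (the function name and the 0-100 inH2O range are illustrative, not from the manual):

```python
def expected_ma(applied, lrv, urv):
    # Ideal output of a linear 4-20 mA transmitter for an applied input,
    # given the configured lower (LRV) and upper (URV) range values.
    return 4.0 + 16.0 * (applied - lrv) / (urv - lrv)

# e.g. after reranging to 0-100 inH2O:
print(expected_ma(0.0, 0.0, 100.0))    # lower range value -> 4.0 mA
print(expected_ma(100.0, 0.0, 100.0))  # upper range value -> 20.0 mA
```

If the buttons took effect, the loop current at the LRV and URV pressures should match these values.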

Regards,
J.Pinzon
Instruments Technician


Posted by Michael Batchelor on 3 February, 2007 - 9:22 pm
You are missing a very important aspect. The 4-20 mA signal is merely how the transmitter "represents" some measured physical property to a control system. What is it measuring?!

0-1000 PSI, or maybe 0-50 liters/min, or 100-500 deg Celsius?

The short answer is: simulate the measured physical property at 0%, 25%, 50%, 75%, 100%, 75%, 50%, 25%, then 0% (9 points total) on the transmitter's sensing input, and verify that you read 4 mA, 8 mA, 12 mA, 16 mA, 20 mA, 16 mA, 12 mA, 8 mA, then 4 mA on the output loop. The 24 VDC is pretty much immaterial.
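The up/down sequence above and the ideal output at each point can be sketched in Python (a hypothetical helper, assuming a perfectly linear transmitter):

```python
def nine_point_sequence():
    # The 0-25-50-75-100-75-50-25-0 % up/down test points, paired with
    # the ideal 4-20 mA output at each (linear transmitter assumed).
    percents = [0, 25, 50, 75, 100, 75, 50, 25, 0]
    return [(p, 4.0 + 16.0 * p / 100.0) for p in percents]

for pct, ma in nine_point_sequence():
    print(f"{pct:3d}% -> {ma:4.1f} mA")
```

Running both directions (up then down) also exposes hysteresis that a one-way sweep would hide.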

Now, having said that, keep in mind that I don't know your process at all, so what I told you might not make any sense for your application.

Michael R. Batchelor
www.ind-info.com

GUERRILLA MAINTENANCE [TM] PLC Training
5 Day Hands on PLC Boot Camp for Allen Bradley
PLC-5, SLC-500, and ControlLogix
www.ind-info.com/schedule.html
training@ind-info.com

Industrial Informatics, Inc.
1013 Bankton Cir., Suite C
Hanahan, SC 29406

843-329-0342 x111 Voice
843-412-2692 Cell
843-329-0343 FAX


Posted by walterik on 6 February, 2007 - 12:09 am
Thanks Mr. Batchelor, that's a good point to bear in mind. I'm just starting with control instruments and I appreciate your information.
We are starting up a water injection system for NOx reduction in turbine combustion.


Posted by denn on 6 February, 2007 - 12:24 am
Michael,

I concur with the short answer, with one caution: the transmitter may have a specific cut-off point configured, usually at the very bottom end, depending on the process (for flow, say, to reduce totalizer inaccuracy).

In that case I would suggest the lower test point (the 0% process simulation) be raised to just above the cut-off point. This improves the accuracy of the check by keeping it on the straight-line curve from process input to instrument output.

Dennis


Posted by Glenn Powell on 8 October, 2009 - 4:22 am
Michael,

You should have a SPEC SHEET made up by the engineer showing you all the calibration data required to configure and calibrate the transmitter.

You will be measuring water flow, so it will be either pounds per hour or gallons per hour or minute. That should be shown on the spec sheet, with upper and lower range values specified in inches of water.

Depending on your calibration method (a manometer of the proper range, or a pneumatic deadweight tester certified in inches of water), you then apply the test signal to the transmitter being calibrated in a 9-point calibration curve: 0% flow, 25%, 50%, 75%, 100%, 75%, 50%, 25%, and back to 0% flow. (Note: 0% flow is usually at 3 PSIG or 4 mA transmitter output.) This is standard procedure for a USA calibration; the values may be different elsewhere. NOTE: make sure the controller is in manual before starting the calibration.
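Glenn's note that 0% is "3 PSIG or 4 mA" reflects the fact that the pneumatic 3-15 psig standard and the electronic 4-20 mA standard are both linear 0-100% signals with a live zero. A quick conversion sketch in Python (function name is illustrative):

```python
def psig_to_ma(psig):
    # Map a pneumatic 3-15 psig signal to its 4-20 mA equivalent.
    # Both scales span 0-100% of process range with a live zero,
    # so the conversion is a straight linear interpolation.
    pct = (psig - 3.0) / 12.0
    return 4.0 + 16.0 * pct

print(psig_to_ma(3.0))   # 0%   -> 4.0 mA
print(psig_to_ma(15.0))  # 100% -> 20.0 mA
```

The live zero is what lets the loop distinguish "0% of range" from a dead instrument or broken wire.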


Posted by Roy Matson on 9 October, 2009 - 6:49 pm
Glenn,

I'm not sure about the 9-point calibration.

What's the point if you only have a zero and a span adjustment? It may be "standard procedure", but I have never seen it done that way. Perhaps in the factory.

Roy


Posted by M Griffin on 10 October, 2009 - 2:39 pm
In reply to Roy Matson: There are two sides to calibration. There is adjusting the calibration, and there is checking the calibration. When you are adjusting the calibration, then typically all you have is zero and span (or offset and gain). When you are checking the calibration however, you aren't adjusting anything, so you can take multiple readings.

The reason for checking at intermediate points is to look for non-linearities in the readings. This can indicate a problem with the instrument itself, or it might be due to incorrect installation.

The most common example I can think of is when replacing an LVDT. If you don't have it centered correctly, you could be operating in the non-linear region at either end. If you just check it at two points, you won't see the error because it will be *correct* at those two points. Everywhere else however, it will be wrong. This is actually quite a common problem, as LVDTs are usually installed with just a pinch clamp. The only way to detect this is to check it at multiple points.

For other types of instruments the linearity check may turn up different problems. In some cases all you can do is replace the instrument (or signal conditioner, etc.).

Normally I recommend setting a tolerance to the calibration, and if the readings are not outside of the tolerance then don't adjust it (unless it is an initial installation). Most of the instruments that I have dealt with don't seem to drift, so a change usually indicates damage or fatigue (or an incorrect initial installation).
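A minimal sketch of this "check, don't adjust" approach: compare as-found readings against the ideal linear output and only flag points outside a tolerance (the function name and the 0.05 mA default tolerance are illustrative assumptions):

```python
def as_found_check(readings, tol_ma=0.05):
    # readings: list of (percent_of_span, measured_mA) pairs.
    # Returns the points whose deviation from the ideal linear 4-20 mA
    # output exceeds the tolerance; an empty list means leave it alone.
    out_of_tol = []
    for pct, measured in readings:
        ideal = 4.0 + 16.0 * pct / 100.0
        if abs(measured - ideal) > tol_ma:
            out_of_tol.append((pct, measured, ideal))
    return out_of_tol
```

Note that an endpoint-only check would pass a non-linear instrument; including the 25/50/75% points is what catches the kind of mid-range error the LVDT example above describes.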


Posted by Jonas Berge on 3 February, 2010 - 11:31 am
Coming back to the original posting by David, re-ranging vs. calibration is explained in the technical white paper on this page:
http://www.eddl.org/DeviceManagement/Pages/Calibration.aspx

Cheers,
Jonas


Posted by Denn2602 on 4 February, 2010 - 12:31 pm
Dear Mr.Batchelor

As far as I know, it is not always 9 points as you said.
The number of points needed depends on the accuracy of the DUT (device under test): the higher the transmitter's accuracy, the more points we must check across its range.
