Please, does anyone know a procedure to calibrate a 4-20 mA, 24 V transmitter? Thanks, from Argentina.
Modern industrial transmitters are calibrated at the factory to a traceable national standard, to an accuracy better than 99 out of 100 shops can duplicate with field calibrators. So you shouldn't need to calibrate a new transmitter.
Re-ranging what the 4-20 mA output represents is not 'calibration', it is 'configuration'. Many 'smart' transmitters can be re-ranged in the field. Dumb transmitters cannot be re-ranged, but they can be tweaked slightly, which is really a calibration procedure or drift adjustment.
Is your transmitter smart or dumb?
Some smart transmitters, like the Siemens DS3 series, have built-in pushbuttons and a digital display which are used to range the transmitter (with or without a 'source'). Does yours?
Alternatively, HART is a digital protocol designed for configuring transmitters.
Most 'smart' transmitters are HART compatible which requires the use of either a handheld HART communicator or Windows software and a HART modem to access the configuration parameters. Configuring a transmitter with HART requires that the transmitter be powered up, and does not require a source.
'Dumb' transmitters require some source (pressure for a pressure transmitter, a signal source for a temperature transmitter) to properly tweak the pot adjustments. Does yours have pots with screwdriver adjustments?
I have a Rosemount smart transmitter. I want to know the procedure for changing the upper range value without using HART.
I don't think you can. I think you need either HART configuration software (available from several manufacturers) or a HART communicator. I went and looked at a Rosemount pressure transmitter manual just to be sure, and you CAN adjust the zero, but not, apparently, the upper range value.
Editor in Chief
blog:Sound OFF!! http://waltboyes.livejournal.com
Putman Media Inc.
555 W. Pierce Rd. Suite 301
Itasca, IL 60143
It is from a 3051 Rosemount manual.
Rerange with a pressure input source and the local zero and span buttons (without a HART Communicator):
1. Loosen the screw holding the certification label on top of the transmitter housing, and rotate the label to expose the zero and span buttons.
2. Using a pressure source with an accuracy three to ten times the desired calibrated accuracy, apply a pressure equivalent to the lower range value to the high side of the transmitter.
3. To set the 4mA point, press and hold the zero button for at least two seconds, then verify that the output is 4mA.
*(Very important: this is not a zero trim; for a zero trim you will need to use a HART Communicator.)*
4. Apply a pressure equivalent to the upper range value to the high side of the transmitter.
5. To set the 20mA point, press and hold the span button for at least two seconds, then verify that the output is 20mA.
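What the zero and span buttons are setting is just the linear mapping from range values to loop current. A minimal sketch of that relationship (the function name is my own, not Rosemount's):

```python
def expected_output_ma(pv, lrv, urv):
    """Expected loop current for process value pv, given the lower (lrv)
    and upper (urv) range values, assuming a linear 4-20 mA output."""
    return 4.0 + 16.0 * (pv - lrv) / (urv - lrv)

# Example: transmitter ranged 0-100 inH2O
print(expected_output_ma(0.0, 0.0, 100.0))    # 4.0 mA at the lower range value
print(expected_output_ma(100.0, 0.0, 100.0))  # 20.0 mA at the upper range value
print(expected_output_ma(50.0, 0.0, 100.0))   # 12.0 mA at mid-scale
```

This is useful for sanity-checking the readings you verify in steps 3 and 5 after a rerange.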
The complete manual and information is available on the Rosemount website.
You are missing a very important aspect. The 4-20 mA is merely how the transmitter represents some measured physical property to a control system. What is it measuring?!
0-1000 PSI, or maybe 0-50 liters/min, or 100-500 degrees Celsius?
The short answer is: simulate the measured physical property at 0%, 25%, 50%, 75%, 100%, 75%, 50%, 25%, then 0% (9 points total) on the transmitter's sensing input, and verify that you read 4 mA, 8 mA, 12 mA, 16 mA, 20 mA, 16 mA, 12 mA, 8 mA, then 4 mA on the output loop. The 24 VDC is pretty much immaterial.
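That up-and-down check can be sketched as a small routine. This is only an illustration (the function names, the stubbed transmitter, and the 0.05 mA tolerance are all my assumptions, not any standard):

```python
def nine_point_check(simulate, read_ma, tol_ma=0.05):
    """Walk the input up and back down through the 9 standard points and
    compare each loop reading against the ideal 4-20 mA line.
    simulate(pct) applies the input; read_ma() returns the loop current.
    tol_ma is an assumed acceptance tolerance; take yours from the spec."""
    points = [0, 25, 50, 75, 100, 75, 50, 25, 0]
    results = []
    for pct in points:
        simulate(pct)                     # apply the simulated input
        ideal = 4.0 + 16.0 * pct / 100.0  # 4, 8, 12, 16, 20 mA ...
        actual = read_ma()
        results.append((pct, ideal, actual, abs(actual - ideal) <= tol_ma))
    return results

# Stubbed example: a perfect transmitter passes every point
state = {"pct": 0}
results = nine_point_check(lambda p: state.update(pct=p),
                           lambda: 4.0 + 16.0 * state["pct"] / 100.0)
print(all(passed for _, _, _, passed in results))  # True
```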
Now, having said that, keep in mind that I don't know your process at all, so what I told you might not make any sense for your application.
Michael R. Batchelor
GUERRILLA MAINTENANCE [TM] PLC Training
5 Day Hands on PLC Boot Camp for Allen Bradley
PLC-5, SLC-500, and ControlLogix
Industrial Informatics, Inc.
1013 Bankton Cir., Suite C
Hanahan, SC 29406
843-329-0342 x111 Voice
Thanks Mr. Batchelor, that's a good point to bear in mind. I'm just starting with control instruments and I appreciate your information.
We are starting with a water injection system for NOx reduction in the turbine combustion.
I concur with the short answer, with one caution. If there is a specific cut-off point set up in the transmitter (usually at the very bottom end, depending on the process, e.g. a low-flow cut-off to reduce totalizer inaccuracy), I would suggest the lower point (the 0% process simulation) be increased to just above the cut-off point. This improves accuracy by ensuring the checks stay on the straight-line portion of the process-to-output curve.
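One way to express that adjustment, assuming the cut-off is known as a percent of span (a sketch; the 1% margin is my own assumption, not from any vendor manual):

```python
def check_points_with_cutoff(cutoff_pct, margin_pct=1.0):
    """Standard up/down check points, but with the 0% point raised to
    just above the low cut-off so every check lands on the linear part
    of the curve.  margin_pct is an assumed margin above the cut-off."""
    low = cutoff_pct + margin_pct
    return [low if p == 0 else p for p in [0, 25, 50, 75, 100, 75, 50, 25, 0]]

print(check_points_with_cutoff(2.0))
# [3.0, 25, 50, 75, 100, 75, 50, 25, 3.0]
```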
You should have a SPEC SHEET made up by the engineer showing you all the calibration data required to configure and calibrate the transmitter.
You will be measuring water flow, so it will be either pounds per hour or gallons per hour or minute. That should be shown on the spec sheet, with upper and lower range values specified in inches of water.
Depending on your calibration method, using a manometer of the proper range or a pneumatic deadweight tester certified in inches of water, you then apply the test signal to the transmitter being calibrated in a 9-point calibration curve: 0% flow, 25%, 50%, 75%, 100%, 75%, 50%, 25% and back to 0% flow. (Note: 0% flow is usually 3 PSIG or 4 mA transmitter output.) THIS IS STANDARD PROCEDURE FOR A USA CALIBRATION; THE VALUES MAY BE DIFFERENT ELSEWHERE. NOTE: MAKE SURE THE CONTROLLER IS IN MANUAL BEFORE STARTING THE CALIBRATION.
I'm not sure about the 9 point calibration.
What's the point if you only have a zero and span adjustment? It may be "Standard Procedure" but I have never seen it done that way. Perhaps in the factory.
In reply to Roy Matson: There are two sides to calibration. There is adjusting the calibration, and there is checking the calibration. When you are adjusting the calibration, then typically all you have is zero and span (or offset and gain). When you are checking the calibration however, you aren't adjusting anything, so you can take multiple readings.
The reason for checking at intermediate points is to look for non-linearities in the readings. This can indicate a problem with the instrument itself, or it might be due to incorrect installation.
The most common example I can think of is when replacing an LVDT. If you don't have it centered correctly, you could be operating in the non-linear region at either end. If you just check it at two points, you won't see the error because it will be *correct* at those two points. Everywhere else however, it will be wrong. This is actually quite a common problem, as LVDTs are usually installed with just a pinch clamp. The only way to detect this is to check it at multiple points.
For other types of instruments the linearity check may turn up different problems. In some cases all you can do is replace the instrument (or signal conditioner, etc.).
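The multi-point idea above boils down to computing each reading's deviation from the ideal line; endpoints can be dead-on while mid-scale points reveal the problem. A sketch (function name and example numbers are mine, purely illustrative):

```python
def linearity_errors(readings):
    """readings: list of (percent_of_span, measured_mA) pairs.
    Returns each point's deviation from the ideal 4-20 mA line,
    rounded to 3 decimal places."""
    return [(pct, round(ma - (4.0 + 16.0 * pct / 100.0), 3))
            for pct, ma in readings]

# A mis-centred LVDT: correct at both ends, off at mid-scale
print(linearity_errors([(0, 4.00), (50, 11.70), (100, 20.00)]))
# [(0, 0.0), (50, -0.3), (100, 0.0)]
```

A two-point check of this instrument would pass; only the intermediate points expose the non-linearity.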
Normally I recommend setting a tolerance to the calibration, and if the readings are not outside of the tolerance then don't adjust it (unless it is an initial installation). Most of the instruments that I have dealt with don't seem to drift, so a change usually indicates damage or fatigue (or an incorrect initial installation).
Coming back to the original posting by David, re-ranging vs. calibration is explained in the technical white paper on this page: