PX_VIB_GAIN in Mark V

Thread Starter: anonymous

Dear all,

How is PX_VIB_GAIN set in the Mark V for the vibration calculation? We always have the problem that the vibration reading differs between the Bently Nevada system and the Mark V, because this setting varies between 1.014 and 1.446.
 
I've searched several heavy duty gas turbine jobs I have, as well as an LM application, and I can't find this signal name.

Can you please be more specific about the application (gas or steam; heavy duty or aero; etc.) and what the signal does? Is it in I/O Configuration or a Control Constant in the CSP?

I'm going to guess it has something to do with a proximitor vibration pick-up gain. But that's just a guess.
 
Dear Sir,
Ours is a 500 MW steam turbine.

It's under the DIAGC Data Display, TCQB#B Diagnostics, Proximitors 1-18 Vibration Inputs, i.e.:

        FPKD   FVIB_PTP   FVIB_GAIN   PX_VIB_GAIN   PX_VIB
PX1 =   -10    241        1.036       1.453         -0.1

But these are the values with the unit off bar; I will try to get the other values while the unit is running.

FPKD is the input signal from the Bently system, and PX_VIB is the final value read by the Mark V as BB1X. I observed that at low load, i.e. from synchronization to 300 MW, the value of PX_VIB_GAIN varies between 1.01 and 1.45 until it settles at 1.01, and you can see the Bently reading is different from the Mark V value.
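To make the chain concrete, here is a minimal Python sketch of how I read that table; the multiplicative composition of the two gains is purely my assumption, not a documented Mark V formula, and the raw mm scaling is invented for illustration:

```python
# Hypothetical sketch of the Mark V vibration chain described above.
# ASSUMPTION: FVIB_GAIN and PX_VIB_GAIN are applied multiplicatively to the
# raw Bently input; the actual Mark V arithmetic is not documented here.

def px_vib(fpkd_mm: float, fvib_gain: float, px_vib_gain: float) -> float:
    """Displayed vibration (e.g. BB1X) from the raw input (hypothetical)."""
    return fpkd_mm * fvib_gain * px_vib_gain

# With PX_VIB_GAIN drifting between 1.01 and 1.45, the same raw input yields
# displayed values differing by up to ~43%:
raw = 0.145  # mm peak-to-peak, hypothetical true displacement
print(px_vib(raw, 1.036, 1.01))  # ~0.152 mm
print(px_vib(raw, 1.036, 1.45))  # ~0.218 mm
```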

thanks in advance
 
It's been said before on control.com: The data from DIAGC is suspect since there are a very limited number of people who can say for certain that DIAGC.DAT is exactly correct for the versions of cards and PROMs installed in the panel. It was very common for DIAGC.DAT not to be properly updated if PROMs were ever changed/upgraded in the field. It was also common for incorrect DIAGC.DAT files to be shipped to site with the original software and never correctly updated.

Having said that, I'm not familiar with the signal you are asking about. Since proximitor feedback is a function of speed I wonder if that number is something that is being calculated based on speed changes (which might be very small but perceptible during load changes). How steady is the frequency of the grid to which you are connected?

Have you plotted (using VIEW2) turbine speed and load and the proximitor inputs during loading and unloading?

I have seen a lot of differences between readings on a B-N monitor/rack and those on Speedtronic turbine control panels. Sometimes that can be attributed to PROM versions and sometimes it can be attributed to configuration settings, but in my experience it's very difficult to get the indications of the two systems to agree--even when they are using the same proximitor inputs!

Sometimes the differences are small and "negligible"; sometimes they are very large and almost disturbing. But, differences are not uncommon.

Is this difference you are experiencing only on one proximitor input, or just the largest on this particular input, or do all the proximitor inputs have about the same difference?

I hope someone else here on control.com has seen this problem and can provide some help. My experience has been that the differences were only narrowed with PROM changes. And a lot of time and energy was expended in trying to get to the cause of the differences; a lot of data was taken using an ADRE system over a couple of weeks.
 
CSA, thanks for the feedback. I will plot the graph and analyze the trend. I just need to know how they set that gain: what is the input, calculation, or f(x), if any. GE did mention in the Mark V checkout procedure for the vibration probes that we need to match the Mark V and wobulator (KE3) speeds when doing the probe calibration; if not, there will be an error between them, but no further information was given.

I already called in the GE TA, but he is also not very sure about that setting. I have also called the Bently MDS engineer to set up ADRE, because right now the reading fluctuates between 0.19 and 0.21 mm; our alarm is set at 0.2 mm and the trip at 0.25 mm, and as I mentioned earlier the BN local reading is only 0.14 to 0.16 mm. All the vibration readings have this error, but right now bearing 4 is the highest.
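As a quick sanity check on those numbers (my arithmetic only, not a confirmed explanation): the ratio of the Mark V reading to the local BN reading overlaps the observed PX_VIB_GAIN range, so the gain alone could plausibly account for the whole discrepancy:

```python
# Implied gain between the Mark V and local BN readings quoted above.
markv = (0.19, 0.21)  # Mark V reading, mm p-p
bn = (0.14, 0.16)     # local Bently Nevada reading, mm p-p

lo = markv[0] / bn[1]  # ~1.19
hi = markv[1] / bn[0]  # ~1.50
print(f"implied ratio: {lo:.2f} to {hi:.2f}")  # overlaps the 1.014-1.446 gain range
```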

thanks again
 
Hi,

Just a quick post, and I'll check when back in the office on Monday, but there is the possibility of installing bandpass filters on the BN and Sensonics systems. It is quite feasible that the filters are set at different frequencies and with different roll-offs, so as to create a different indication.
 
Perhaps GE has set the gain to vary with shaft rotative speed to create a sort-of parametric alarming scheme.

Details...

I worked for Bently Nevada for 20+ years and can confirm that the passband for proximity probes would be plenty wide (usually 4 Hz to 4 kHz) in 3300, 3500, and most other BN systems to ensure that the response would be flat over the turbine's operational speeds from start-up through 3000 rpm or 3600 rpm (depending on whether 50 Hz or 60 Hz trains). Default low-pass filter corners for proximity probe-based radial vibration channels are almost always 4 kHz (240,000 cpm) to allow the fundamental and plenty of harmonics to be seen by the monitor. It is thus very unlikely that the discrepancy you are seeing between the Mark V and the BN monitors is related to filter corner roll-offs (at least, not in the BN monitors). I cannot speak for the P-Vibe and V-Vibe cards used in the Mark-series controllers, but I suspect they also have very flat passbands over the turbine running speeds.
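To illustrate the point about the passband (my own SciPy sketch, not BN firmware), a 4 Hz to 4 kHz Butterworth bandpass leaves a 50 Hz (3000 rpm) 1X component essentially untouched:

```python
import numpy as np
from scipy import signal

fs = 20_000.0                                # sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)
x = 0.10 * np.sin(2 * np.pi * 50 * t)        # 50 Hz (3000 rpm) 1X, mm

# 4 Hz - 4 kHz Butterworth bandpass, the typical BN passband cited above
sos = signal.butter(2, [4.0, 4000.0], btype="bandpass", fs=fs, output="sos")
y = signal.sosfiltfilt(sos, x)

print(f"p-p in: {np.ptp(x):.4f} mm, p-p out: {np.ptp(y):.4f} mm")
# Attenuation at 50 Hz is negligible, so filter corners alone are an unlikely
# explanation for a 30-45% reading discrepancy.
```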

Instead, what appears to be going on as my colleagues and I read through this thread and surmised possible influences, is that GE's algorithm is perhaps adjusting the gain based on the turbine's shaft rotative speed. This would fit with their comment that the wobulator (KE3) and Mark V speed must match or else there will be discrepancies. This *may* be because GE has done calcs that show the allowable vibration amplitude varies with speed (since force varies with the square of speed) and this might be some kind of work-around they are using to achieve parametric alarming (adjusting the gain with speed so that the "effective" alarm level is changing with speed). Newer control systems such as the Mark VI and Mark VIe probably allow the alarm setpoint(s) to be adjusted directly as a function of speed, without messing with the gain (which will normally be 200 mV/mil for a prox probe), so it could be that these gain changes do not appear in newer vintages of GE controls.
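If that speculation were right, the behavior could be reproduced with a gain schedule like the following sketch (entirely speculative; the breakpoints and the interpolation are invented for illustration):

```python
import numpy as np

# SPECULATIVE sketch of a speed-dependent gain schedule that would make a
# fixed alarm setpoint behave parametrically. The breakpoints are invented;
# nothing here is confirmed Mark V behavior.
speed_rpm = np.array([0.0, 1500.0, 3000.0])
gain = np.array([1.446, 1.230, 1.014])       # gain shrinking as speed rises

def px_vib_gain(rpm: float) -> float:
    """Interpolated gain at the current shaft speed (hypothetical)."""
    return float(np.interp(rpm, speed_rpm, gain))

alarm_mm = 0.2  # fixed alarm setpoint from this thread
for rpm in (600, 1500, 3000):
    g = px_vib_gain(rpm)
    # Comparing a gained-up signal against a fixed setpoint is equivalent to
    # a speed-dependent "effective" setpoint of alarm_mm / g:
    print(f"{rpm:5d} rpm: gain {g:.3f}, effective alarm {alarm_mm / g:.3f} mm")
```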

We are curious to know what you find out, so please post an answer once you have ascertained root cause.

Hope these observations help if you have not already solved it.

DISCLAIMER: The above is simply speculation, but it does seem to fit with the comment about the gain varying between 1.014 and 1.446. Perhaps you can clarify by providing details of the gain changes (if any) at various speeds while the turbine is ramping.
 
I was on-site and had a look at this. The gain change was NOT a function of speed, as the machine was synchronized and operating at a constant speed. Among the various people I worked with, we came up with two possible scenarios:

1. Gain is set based on frequency content in the overall vibration signal, OR

2. GE is somehow trying to simulate Smax in this measurement (see the sketch after this list).
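For reference on scenario 2: Smax is the largest excursion of the shaft orbit built from two orthogonal probes, and for equal orthogonal components it can exceed the single-probe zero-to-peak value by up to a factor of sqrt(2) ~ 1.41 (perhaps coincidentally close to the upper end of the observed gain range). A minimal concept sketch of the idea, not GE's actual algorithm:

```python
import numpy as np

# Concept sketch of Smax: the peak excursion of the orbit formed by two
# orthogonal proximity probes. Not GE's actual algorithm.
t = np.linspace(0, 0.02, 2000, endpoint=False)       # one 50 Hz revolution
x = 0.075 * np.sin(2 * np.pi * 50 * t)               # probe X, mm (0.15 mm p-p)
y = 0.075 * np.sin(2 * np.pi * 50 * t + np.pi / 3)   # probe Y, 60 deg phase

smax = np.max(np.hypot(x, y))                        # peak orbit radius, mm
print(f"single-probe p-p: {np.ptp(x):.3f} mm, Smax: {smax:.3f} mm")
# Here Smax (~0.092 mm) exceeds the single-probe zero-to-peak (0.075 mm), so
# a channel reporting an Smax-style value will not match a plain p-p channel.
```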

Please ask around among your (ex-)GE colleagues, because GE is not responding to any inquiries in this matter.
 