I know I will probably get yelled at again, but here it goes anyway.
We are working for a city where we have added some new controls to some Diesel Generators. Right now we are working in strictly AVR mode with a 15% voltage droop setpoint. We still have some MVAr problems at the tie line into the grid. I think I need to implement PF or VAr control - probably PF mode, I'm just not sure what a setpoint would be. The system has transformer taps and capacitor banks which the operators use to try to maintain the bus voltage. Whenever I have my generator online they have to tap 6 times and throw some capacitors online, which they say they don't have to do with their manual generators.
I probably have not given enough information about the city's grid. These are low-RPM generators @ 4.45 MW, used in both "island" and "infinite" grids. Please don't complain about the terms "infinite" or "island". I guess if I used PF mode the VArs would vary with the KW load, versus VAr mode, which will not. Do I want a -KVAr reading at the generator or a +KVAr reading - or should I try to maintain Unity and have a 0 KVAr reading, and would that help at the Tie Line where they are reading MVAr? Thanks for your help, it is appreciated.
You should be able to control VAr's and pf (the two are related) using the AVR (Automatic Voltage Regulator) adjustment for Raise- and Lower Volts. (This is presuming there is no other automatic control that is active at the time.) The "volts" you are raising and lowering are the generator terminal volts.
When generator terminal volts are equal to the grid volts then the VAr reading will be zero, and the power factor will be unity (1.0). (We're talking about "across" the breaker that is closed to synchronize the generator to the grid. There is usually a set of PTs (Potential Transformers) on the generator side of the breaker and one on the line (grid) side of the breaker.)
If the generator terminal voltage is lower than the grid voltage then the VAr reading will increase in the Leading direction, and the power factor will decrease (be less than 1.0) in the Leading direction. Leading VAr and pf are usually considered to be negative.
If the generator terminal voltage is greater than the grid voltage then the VAr reading will increase in the Lagging direction, and the power factor will decrease (be less than 1.0) in the Lagging direction. Lagging VAr and pf are usually considered to be positive.
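The three cases above follow from the usual simplified reactive-power relation across a coupling reactance, Q ≈ V_gen(V_gen − V_grid)/X, with resistance neglected. A minimal sketch (the voltages and reactance below are made-up illustrative numbers, not values from this system):

```python
def reactive_flow_var(v_gen, v_grid, x_ohms):
    """Approximate reactive power a generator pushes into the grid across
    a coupling reactance x_ohms, per Q = V_gen * (V_gen - V_grid) / X
    (resistance neglected). Positive = Lagging (VAr's out),
    negative = Leading (VAr's in)."""
    return v_gen * (v_gen - v_grid) / x_ohms

# Generator volts above grid volts -> Lagging (positive) VAr's
print(reactive_flow_var(4260.0, 4160.0, 2.0) > 0)   # True
# Generator volts below grid volts -> Leading (negative) VAr's
print(reactive_flow_var(4060.0, 4160.0, 2.0) < 0)   # True
# Volts matched -> zero VAr's, unity power factor
print(reactive_flow_var(4160.0, 4160.0, 2.0) == 0)  # True
```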
Now, the generator terminal voltage can't be much greater or less than the grid voltage to which it is connected (except on some "soft" systems) and so one doesn't generally see a difference in system or terminal voltage when making manual adjustments to the AVR.
The generator manufacturer provides a reactive capability curve which defines the limits of operation for the generator, for real power (watts), and for reactive power (VAr's, Leading and Lagging).
Someone in your organization should be telling you what the desired setpoint for VAr or pf operation should be depending on how the units are being operated (in "island" mode or when paralleled to the grid with other generators).
All that VAr- or pf control do is automatically monitor VAr's or pf and adjust the generator terminal voltage to make the actual VAr's or pf equal to the setpoint.
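That monitoring-and-adjusting action can be sketched as a simple integral-style loop. This is a sketch only; the gain, limits, and names are illustrative assumptions, not the internals of any particular AVR:

```python
def var_control_step(var_actual, var_setpoint, volt_setpoint,
                     gain=1e-4, v_min=0.95, v_max=1.05):
    """One scan of a simple integral-style VAr controller (illustrative).
    VAr's in kVAr, voltage setpoint in per-unit. Raising the voltage
    setpoint raises Lagging VAr's, so repeating this each scan drives the
    VAr error toward zero, clamped to the AVR's raise/lower limits."""
    error = var_setpoint - var_actual
    volt_setpoint += gain * error
    return min(max(volt_setpoint, v_min), v_max)

# Actual VAr's below setpoint -> nudge terminal volts up a little
print(round(var_control_step(var_actual=0.0, var_setpoint=100.0,
                             volt_setpoint=1.0), 6))  # 1.01
```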
This is all pretty basic power generation stuff, and you should be able to find lots of information on it in your local public library or on the Internet. One great place to look on the Internet is canteach.candu.org. There is a lot of really excellent basic information there in a very understandable and logical presentation.
Thanks for the reply - I understand everything you have said. I know somebody from the City should know how they want to run this system, but they don't. I have done lots and lots of reading on the Internet and found a lot of good technical documents, but none of them really explain the problem that I am having.
My Generator is always running about 50-100 volts higher than the Bus. I have tried messing with the Voltage setpoint with a PID I tuned outside the AVR controller - I have also tried matching Bus voltage and have the same problem. I have also tried messing with the Voltage Droop setpoint, which is currently at 15%. The only thing I have not tried is PF/VAr control, but the problem with that is what the setpoint should be, and whether the setpoint should vary with what is happening in their system. Thanks again for your help.
Is this "control system" controlling the diesel *and* the AVR? Or, is it just sending a signal to the AVR (exciter regulator control) to lower the generator terminal voltage setpoint?
Is there some kind of indication on the AVR that it has reached its minimum limit? (One would hope so!)
If so, then someone needs to contact the generator manufacturer/vendor to find out what the minimum limit should be, compare it to the programmed or configured min limit currently running in the machine, and then make a decision about how to proceed.
If the current min limit is too high, well then, the decision is simple: lower it.
If the current min limit is correct per the generator manufacturer/vendor, then someone has to tell the City they bought a system that won't go any lower. Some transformers have tap changers, and those might be used to change the voltage at the generator terminals, which would help the problem.
I believe you said this is a retrofit control system. If so, how were the units being operated before? (Someone with the City should be able to tell you that, and I'll bet they're telling you they didn't have any problems like this before, which may or may not be true. Most owners/operators try to get retrofit control system providers to fix all past evil by saying, "It worked just fine--until you put that new control system in!") Is there any data to support their claim that the units were operated at low VAr levels, near unity power factor?
Is the excitation system new, as well? It could be a problem with the application of the excitation system; it just can't be operated to provide the desired or required generator terminal voltage. There might be some remedy for that, but it would likely involve some "wasteful" resistors or a different rectifier bridge, or maybe the transformer tap changer (if it exists) could be changed.
A VAr or pf setpoint is just a condition that the operator either decides to run the unit(s) at or that the operator is contractually required to run the unit(s) at. (Yes; it can be specified in a purchased power agreement what VAr setting or pf the unit is to be operated, and even different settings can be specified for different times of the year and even different times of the day!)
The power factor is nothing more than a measure of the "efficiency" of the energy being input to a generator (torque and excitation, because it takes power of one sort or another to produce field current and generator terminal voltage). A pf of 1.0 (unity) means that all the energy going into the generator is going "out" as real power: watts, KW, MW. A power factor of 0.85 means that 85% of the apparent power out of the generator is real power (again, Watts, KW, MW); the remainder is going "out" (or "in", depending on whether the pf is Lagging or Leading, respectively) as reactive power.
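For the arithmetic behind that, the power triangle gives the reactive power that rides along with a given real power at a given pf. (The 4450 kW figure comes from the 4.45 MW rating mentioned earlier in the thread; the rest is just the standard relation.)

```python
import math

def reactive_kvar(kw, pf):
    """Reactive power (kVAr) that accompanies kw of real power at power
    factor pf, from the power triangle:
    kVA = kW / pf; kVAr = sqrt(kVA^2 - kW^2)."""
    kva = kw / pf
    return math.sqrt(kva**2 - kw**2)

# Classic 3-4-5 triangle: 4000 kW at 0.8 pf carries 3000 kVAr
print(round(reactive_kvar(4000.0, 0.8)))   # 3000
# A 4450 kW unit at 0.85 pf carries roughly 2758 kVAr
print(round(reactive_kvar(4450.0, 0.85)))  # 2758
```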
Power producers don't generally get paid for "producing" VAr's; they get paid for producing watts, KW, MW. One doesn't see a VAr-hour meter very often (but they do exist). So, most power producers, left to their own decisions, will choose to produce power at a unity power factor, 1.0, or, to "produce" zero VAr's. Why reduce the amount of watts, KW, MW by increasing the VAr's and reducing the pf if not contractually obligated to do so or if there's no financial incentive to do so?
A VAr or pf setpoint could be anywhere "within" the reactive capability curve of the generator, but, again, without any financial incentive to "produce" VAr's, why do it? Why not operate at zero VAr's, unity pf (1.0)?
Most power plants without any contractual obligation to "produce" VAr's will operate their generators at a pf of anywhere from 0.95 Lagging to 1.0, or a very few Lagging VAr's. This is to try to prevent operation in the Leading direction, which is generally undesirable for several reasons, but usually because most generators (see below) just aren't built to be operated at low Leading power factors.
So, if you're looking for a setpoint, choose any one that keeps Lagging VAr's at or near zero and the power factor near unity (1.0) or slightly Lagging. See if the "control system" can handle that and let us know.
Most generator reactive capability curves will show that most generators are built to and can be operated at Lagging power factors well in excess of Leading power factors. Meaning, they are capable of being "over-excited" more than they are capable of being "under-excited". One reason for this is that if the excitation is reduced too much, the generator field strength will decrease to the point that the generator rotor will fall out of synchronization with the grid, or fall out of step, or "slip a pole". Slipping a pole usually results in catastrophic failure of the generator rotor, the coupling, and sometimes the prime mover rotor.
Another reason that most generators can't be operated at really low leading power factors is that there is heat developed in the generator which can cause lots of problems and eventual failure of insulation, which can also lead to catastrophic failure. (Over-excitation can cause similar heat-related failures, as can over load (producing too many watts, KW, MW).)
The AVR lower limit is supposed to be set to prevent damage either due to heating or slipping a pole when the generator is being operated at low excitation level(s).
So, it would appear that there is either a mis-programmed lower excitation limit, or a mis-configured input (incorrect scaling) that is adversely affecting the lower excitation limit, or the transformer tap changer isn't set correctly, or the generators are mis-applied (don't have the proper range of terminal voltage for the site).
From what you've described, and from what I think we "learned" from the previous thread(s), it would appear that there is some kind of limit which is preventing the AVR (exciter regulator) from reducing the excitation which would reduce the terminal voltage which would increase the power factor and reduce the VAr's. It's either a limit on the AVR or the inputs to the AVR aren't programmed/scaled/configured properly. Or, the site conditions are such that the generators can't be operated at a low enough terminal voltage.
I don't think any PID controller or VAr/pf control is going to fix any of the above.
Lastly, 15% droop for an exciter regulator seems a little excessive, but you say you've tried changing that with no luck. Have you confirmed this value with the AVR manufacturer/vendor?
Your frustration is fairly understandable. You are probably reasonably good with A-B controls and you've been asked to "fix" an undesirable situation without really understanding a lot about power generation. You've spent a good deal of time researching and studying and applying control system fixes for something that was either mis-programmed, mis-scaled, or mis-applied.
Happens all the time: Control system technicians being asked to put band-aids on poor programming and questionable applications. We're trying to help, but we don't know enough about the situation or the site conditions.
Chin up, buddy! Please, let us know how this thing progresses, and, please, provide the previous threads where you were "yelled at" so we can have some more background on this issue. I think I recall at least one, but I'm not sure.
But, you've got a problem, and as a technician you probably feel you should be able to solve it. But, mis-applications happen, like shite happens (you gotta love the Irish!). And while you might be able to come up with a band-aid, that's all it might ever be. And it will confound the heck out of someone who tries to understand it later!
Yeah, my previous thread was about the "Reverse VAr Fault" condition that I was getting with this AVR controller. The AVR controller is the AB 1407-CGCM unit. I have control over the fuel racks, which do not have a typical "Governor" on them; we have put in AB Servos with linear actuators to duplicate one. So we have come a long way in controlling this engine - it actually gives the city a lot more flexibility: they just put in a KW setpoint and hit the Start button on the HMI, and I take care of starting the engine, sync, and then automatic loading and unloading of the generator.
So, from my other thread about the Reverse KVAr fault, I believe I am seeing too much -KVAr and the unit is tripping out - and with a RKVAr fault the CGCM removes excitation, and all hell breaks loose at that time. There is a setting in the CGCM in the UEL section - I currently do not have that enabled - which I know I need to have enabled.
I am pretty sure I have all the PT and CT ratios in the CGCM set up correctly - we have compared that to the actual voltage coming from the PT for the Generator and Bus side to verify that.
The CGCM does provide the PLC with all the metering - and that is where I can see the Generator Voltage is 50-100 volts higher than the bus - the KVAr reading is -50 to -220 KVAr - and the PF reading is -1.0 to -0.97 or so. I will be on site today trying some other stuff.
Thanks for your replies - I do appreciate it.
I thought so (about the RVAR thing).
Is this a new exciter system, or just a new controller on an existing exciter? Because if it's the latter (an exciter controls retrofit, and not a new exciter with a new control system, too), then it's likely a configuration issue: the limits of exciter operation are not properly defined, or something else is amiss with the configuration or the set-up.
I find it very hard to believe that if the generator voltage is 50-100 volts higher than the bus voltage that there are negative VAr's. That just doesn't make sense. Over-excitation (more than the amount required to make generator terminal voltage equal to bus voltage at the generator terminals) results in Lagging (positive) VAr's, a Lagging power factor. This is at the generator terminals, with respect to the generator. I believe you had mentioned something about another point of reference, in some switchyard. In that case, the point of reference changes.
I have seen problems with PT ratios: someone assumes the generator PTs are the same ratio as the bus PTs, when they are not.
Has the polarity of the CTs been verified? Have the CT ratios been verified? Because I've seen the same thing with people assuming that the generator- and bus CTs have the same ratios.
Can you obtain some kind of independent reading of VAr's? Or even just reactive current? There are some very good portable power system analyzers which can help with determining exactly what's happening.
Because high generator volts and reverse VAr's don't go together.
Another possibility is the CGCM isn't working properly. Stranger things have happened. A-B should be able to tell you if there have been reported problems or known issues with that version of the controller from the nameplate and PROM information.
Yeah, this is a totally new installation. We have had AB down at the site 3 times, and every time we walk away with nothing. They have 1 person in the entire company who knows anything about this module, and he has no application experience with it.
Today on site I tried moving the voltage droop setpoint from a positive 15% to a negative 15%, and I did not see anything change in the system. So I am more confused now than before. You are right; everything I have been reading on this subject does not match what I am seeing in the field. Today, again, the generator voltage was higher than the bus.
We have found a guy who is going to come to our customer's site who is supposed to be very good at power generation; hopefully he can shed some light on the subject with him on site.
The end result is we need the MVAr reading at their substation going out to the tie line to read 0. They get charged for VArs, so the operator is constantly changing the taps and capacitors throughout the day to make that happen. If they are running any of the generators, it does not have to equal 0 VArs at the generator terminals. If they need to push out some or take some in, that is fine - as long as the sub reads 0.
Thanks again for your help
I think we're making progress here.
The "operator" is constantly changing transformer taps and capacitors....
The point of reference is not at the generators, it's at the substation.
You say they get charged for VAr's. Do they get charged for "supplying" VAr's to the grid, for "consuming" VAr's from the grid, or for both? I guess that's kind of a "dumb question" since it's from the substation; sorry.
I think there's STILL a problem with the exciter rectifier output--if you can't get your generator terminal volts to equal bus volts prior to synchronization, or after synchronization, that's a problem. Whether the issue is in the CGCM or in the "device" that's controlling the output of the exciter rectifier (I'm presuming the output is DC volts; it really doesn't make any difference--it has to be adjustable), the output doesn't seem to want to go below some level, which is keeping the generator terminal volts "high." Either that level is in the CGCM or it's in the device the CGCM is trying to tell to lower the output to the generator field. (We don't know if the unit has a field with slip rings to which DC from an exciter rectifier bridge is applied, or if it has a rotating exciter of some type with an adjustable DC output to a stationary field, or what!)
If the "operator" is constantly changing transformer taps and capacitors, it might not be possible to "keep up" with those changes, even if you are using VAr/pf control. You would likely have to build some kind of loop to monitor the VAr's at the substation--not at the generator terminals (which is where I've been assuming you're trying to control the VAr's). You may have tried doing that, but I didn't get that from the previous posts. And if you did and you were still having problems, then again I'm thinking the generator terminal voltage is at its minimum limit. Maybe that's why the RVAR thing is kicking in--the exciter voltage is so low that the CGCM "thinks" it must be in reverse VAr's.
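Such an outer loop might look something like the sketch below: an outer trim on the substation MVAr reading cascaded onto the generator's VAr setpoint. The gain and limits are purely illustrative assumptions; real limits would have to come from the generator's reactive capability curve.

```python
def trim_gen_var_setpoint(sub_mvar, gen_var_sp, gain=0.1,
                          var_min=-0.2, var_max=2.0):
    """One pass of a hypothetical outer loop: trim the generator VAr
    setpoint (MVAr) to drive the substation tie-line MVAr reading toward
    zero. Gain and limits are illustrative only."""
    # Substation exporting VAr's -> back the generator's VAr setpoint off
    gen_var_sp -= gain * sub_mvar
    return min(max(gen_var_sp, var_min), var_max)

# Substation exporting 2 MVAr -> lower the generator VAr setpoint a notch
print(round(trim_gen_var_setpoint(sub_mvar=2.0, gen_var_sp=1.0), 3))  # 0.8
```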
And, WOW! If you change the droop setpoint from +15% to -15% and nothing happened! WOW! Either there's something wrong with the CGCM setup/configuration, or the CGCM isn't working properly (which is something which still hasn't been ruled out!).
Sorry for all the exclamation points. (I'm starting to feel like another poster here at control.com! Oops; I did it again! And again. There; that's better.)
Remember, most generators (not all--but most) are not designed to "take in" too many VAr's. Consult the generator reactive capability curve for the generators at your site. They are usually built to "put out" VAr's because most grids "require" VAr's.
But, I think we're getting somewhere. Point of reference is important. If those tap changes by the "operator" are causing the bus voltage to be excessively low at the generator terminals in order to achieve the desired VAr "flow" at the substation, and the exciter is at minimum, then the generator isn't capable of lowering the terminal voltage any more to get the VAr's down.
It's possible that this is one of those situations that's going to really need some consideration and some trained operations personnel with a good operating procedure to be able to operate properly. There is likely going to have to be some coordination between tap setting(s) and capacitor settings and generator output capability to make it work with respect to the VAr's at the substation. You are likely trying to fight a nearly impossible situation to solve without some help from the "operators" and someone with the City who understands how power generation works.
I can see a situation where to achieve 0 VAr's at the substation, the taps and capacitor bank settings are such that the bus voltage to the generators is "very" low, and if the generator excitation can't be lowered any more, then the generators would be "supplying" Lagging VAr's to the substation. In that case it would seem that the substation "operator" would have to change the taps and/or capacitor bank settings to reduce the VAr's out of the substation which were coming from the generators when the generators were paralleled to the grid. That's a lot of coordination and manual control.
Stranger things have happened. But, I think we're getting somewhere. I'm not sure it's going to help you, especially if you can't get the generator terminal voltage to come down any more. If it's at the "floor", then the transformer taps/capacitor settings have to be changed to have any effect. That might even be why, if the generator terminal voltage is already at bottom that a change to the droop setpoint didn't have any effect.
"I see," said the blind man. (Or is it just an illusion?) Actually, I don't think there's any more I can add to this thread. I'd sure like to know how it all turns out. Wish I could be a fly on the wall when the "consultant" is there.
Best of luck!
Just wanted to give an update on what we have found. The reason for the Generator voltage being higher than the Bus voltage was overlooked on my part: the bus voltage connection is single-phase, while we have 3-phase generator voltage connected to the CGCM. So the SCADA was reading an average of the generator phase voltages; when we compared Phase A to Phase A, the voltages matched.
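For illustration, here's how that kind of apples-to-oranges comparison produces a phantom voltage difference (the phase voltages below are hypothetical, not the site's actual readings):

```python
def three_phase_average(va, vb, vc):
    """Average of the three phase readings, as the SCADA was computing."""
    return (va + vb + vc) / 3.0

# Hypothetical, slightly unbalanced generator phase readings (volts)
gen_a, gen_b, gen_c = 2400.0, 2460.0, 2490.0
bus_a = 2400.0   # single-phase bus PT, Phase A only

# Comparing a 3-phase average against one phase shows a phantom offset...
print(three_phase_average(gen_a, gen_b, gen_c) - bus_a)  # 50.0
# ...while the like-for-like Phase A comparison matches
print(gen_a - bus_a)  # 0.0
```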
I ended up implementing PF mode control through the CGCM, which opened up a whole new world I didn't even think about. Now we can produce the VArs through PF mode, and the operator does not have to "mess" with the transformer taps and capacitors as much. I gave them the ability to adjust PF to produce the VArs they need, all the way down to .85 PF.
That made me retune the whole generator operation, because now the engine has to work that much harder to produce those VArs @ .85.
Thanks for your help on the problem and I am sure I'll come across more.
If anyone would like to check out what we have done, we have done a case study on this project.