In Texas we have recently deregulated the utilities, and Texas Utilities has decided to start a tariff for power factors under .95.
So, in an effort to show the company I work for what the future costs will be, I calculated the input power of our motors at several different power factors. What I saw shocked my little eyeballs: the closer the power factor got to .95, the more it cost to run the motor. Now how in the world did this happen? Is the formula for input power wrong? It doesn't seem to be, from what I have found so far. Did I use the wrong number somewhere? That's possible; I used .70, .85, and .95 for my power factor instead of 70, 85, and 95. But either way yields the same basic result: more cost as the power factor gets higher. Or am I just seriously missing something?
I would appreciate someone helping clear this up for me.
I am using the standard three-phase power equation, P = V * I * PF * 1.732 / 1000, for my calculations.
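Here is a quick sketch of the calculation described above. The voltage and current figures (480 V line-to-line, 100 A) are hypothetical, chosen only to illustrate how the formula behaves as the power factor is varied with the current held fixed:

```python
from math import sqrt

def input_power_kw(volts, amps, pf):
    """Three-phase input power in kW: P = V * I * PF * sqrt(3) / 1000."""
    return volts * amps * pf * sqrt(3) / 1000

# Hypothetical motor: 480 V line-to-line, 100 A measured line current.
for pf in (0.70, 0.85, 0.95):
    print(f"PF {pf:.2f}: {input_power_kw(480, 100, pf):.1f} kW")
```

Run this way, the computed kW does rise with PF, which is the puzzling result in question. Note that this holds the measured current constant across all three power factors, rather than holding the motor's real load constant.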