ferroresonance voltage stabilizer

sjh

aka - shadley
Jan 1, 2002
2. You will be hard pressed to find a ground wire in use in this country.
3. No, you will increase your bill significantly. You are billed by the kWh, which is calculated by measuring amp-hours and multiplying by 120. If you have a voltage regulator and the input voltage is 90, your amps will increase by 33% and therefore the calculated kWh by 33 percent, even though the actual kWh usage is almost the same.
 

elkangorito

New member
Sep 24, 2007
You are assuming that the voltage regulator is manually operated. Also, you are assuming that the regulator can operate with a 120v input even though it is set at 90v (as an input). Under these circumstances, I don't think that the regulator will operate satisfactorily. In other words, I think that it will go into "fault".

As far as power is concerned & unless the regulator has the ability to dissipate the "unused" power (which I don't think it can do), it will not be affected. Quite simply, power in equals power out.
 

sjh

aka - shadley
Jan 1, 2002
The problem is not that the power usage changes, nor does it have anything to do with manual vs automatic. The problem is how the power usage is measured. The electric company measures how many amp-hours you use, not how many watt-hours. To convert the one to the other, they multiply by the assumed voltage of 120 V.

Example: voltage in 90 V, amperage out 15 A for one hour = 1,350 watt-hours.

Except the electric company does it this way: assume the voltage is 120 V and the meter measured 15 A for one hour.

So they charge you for 1,800 watt-hours, which is 33% more than you actually used.
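For argument's sake, the arithmetic sjh describes can be put into a few lines. This is only a sketch of his claim: the fixed 120 V multiplier is his assumption about the utility's billing, not a verified metering practice.

```python
# Sketch of the billing discrepancy sjh describes, assuming (as he does)
# that the meter tallies amp-hours and the utility bills at a fixed 120 V.

NOMINAL_V = 120  # nominal voltage the utility is assumed to bill at

def billed_wh(amps, hours, nominal_v=NOMINAL_V):
    """Watt-hours charged for, under the amp-hour assumption."""
    return amps * hours * nominal_v

def actual_wh(volts, amps, hours):
    """Watt-hours actually consumed at the real supply voltage."""
    return volts * amps * hours

actual = actual_wh(90, 15, 1)   # 1350 Wh really used
billed = billed_wh(15, 1)       # 1800 Wh charged
overcharge_pct = (billed - actual) / actual * 100
print(actual, billed, round(overcharge_pct, 1))  # 1350 1800 33.3
```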

I know some people who put in a large automatic voltage regulator and were furious when their electric bill came in at 50% more than before.
 

elkangorito

New member
Sep 24, 2007
The problem is not that the power usage changes, nor does it have anything to do with manual vs automatic. The problem is how the power usage is measured. The electric company measures how many amp-hours you use, not how many watt-hours. To convert the one to the other, they multiply by the assumed voltage of 120 V.

Example: voltage in 90 V, amperage out 15 A for one hour = 1,350 watt-hours.

Except the electric company does it this way: assume the voltage is 120 V and the meter measured 15 A for one hour.

So they charge you for 1,800 watt-hours, which is 33% more than you actually used.

I know some people who put in a large automatic voltage regulator and were furious when their electric bill came in at 50% more than before.
I have 2 questions:

1] What type of meters are used? E.g. induction disc or otherwise.
2] Is the meter on the line side or the load side of the regulator?
 

elkangorito

New member
Sep 24, 2007
1] I don't know. I am not an electrical engineer.
2] Line side. If it were on the load side, it wouldn't change the bill.


The electricity company does not measure amp-hours - the kilowatt-hour meter measures kilowatt-hours. A kilowatt-hour meter does not "assume" anything. It uses two coils, a voltage coil & a current coil, to create magnetic flux, which spins an aluminium disc, which in turn is connected to small counting dials. E.g. if the input voltage is 90 V & 20 A is being drawn for 1 hour, the meter will register that 1.8 kilowatt-hours have been used (ignoring power factor). In other words, provided that the voltage & current stay within acceptable limits (so as not to damage the coils), the kilowatt-hour meter will record the correct kWh usage.
Late model kWh meters are electronic.
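The metering principle described above can be sketched as a simple model (ignoring power factor detail and meter error): the meter registers the product of the actual voltage and current over time, so no nominal-voltage assumption enters into it.

```python
# Sketch: a watt-hour meter effectively integrates actual volts * amps,
# so the recorded energy matches what was really consumed - no assumed
# nominal voltage is involved. Simplified constant-load model.

def metered_kwh(volts, amps, hours, power_factor=1.0):
    """Energy a kWh meter registers: V * I * PF over time, in kWh."""
    return volts * amps * power_factor * hours / 1000.0

# elkangorito's example: 90 V at 20 A for one hour
print(metered_kwh(90, 20, 1))  # 1.8 kWh, not the 2.4 kWh a 120 V assumption would give
```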

Also, Automatic Voltage Regulators/Stabilisers are usually electronic, very efficient (>90%) & can usually handle an input voltage variation of about +/-25% of their rated input voltage without a problem.

I know some people who put in a large automatic voltage regulator and were furious when their electric bill came in at 50% more than before.
I doubt that the cause of such a high electric bill was the voltage regulator. There is more to that problem than meets the eye.
 

Bolt

New member
Jun 12, 2002
Mmm, yes, it will increase the bill, and I'll tell you why. Let's say you're running 10 lights at 100 watts each at 100 volts = 1000 watts.

Now next week you're only getting 50 volts, not 100, so now your power consumption is 500 watts, and you will only pay for 500 watts because that's what you used. Of course many things won't run this low, but it's for argument's sake; in practice, 90 volts instead of 110 is common.

So now you go out and buy one of these regulators, and what it will do is boost the voltage from 90 to 110. Now your lights are brighter, but you are using a bit over 1000 watts of power, because the current drawn at 90 volts is increased to give you the power you asked for, plus the converter might need 50 watts for itself due to its efficiency. So the moral is: if you were previously under voltage, the converter will INCREASE your bill in proportion to how bad your supply was before.
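Bolt's argument amounts to conservation of energy through the regulator: to deliver a given output power, the input current must rise as the input voltage sags, plus the regulator's own losses. A rough sketch follows; the 95% efficiency figure is an assumption for illustration, not a quoted spec.

```python
# Sketch of the power balance through a boosting regulator: input power
# equals output power divided by efficiency, and input current follows
# from the (sagged) line voltage. Efficiency of 0.95 is assumed.

def regulator_input(v_in, p_out_watts, efficiency=0.95):
    """Return (input power in W, input current in A) for a given output power."""
    p_in = p_out_watts / efficiency
    return p_in, p_in / v_in

# 1000 W of lighting fed through a regulator on a 90 V line:
p_in, i_in = regulator_input(90, 1000)
print(round(p_in, 1), round(i_in, 2))  # 1052.6 W drawn, 11.7 A from the line
```

The extra ~53 W here is only the regulator's loss; whether the bill rises further depends on how the meter measures usage, which is the point in dispute above.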

The best way to protect your precious electronics in DR is via an inverter. The good ones have line-conditioning circuits and filters but will switch into circuit on dips, i.e. < 87 volts, protecting your PC supply from brownouts. Likewise, surges over 130 volts will be suppressed, and if the condition lasts more than a few cycles, the inverter will take over your supply to deliver 110. It's ironic, but the inverter will turn on to battery supply during overvoltage to stabilize your house supply.
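The transfer behaviour described here boils down to simple threshold logic. The 87 V and 130 V limits come from the post above; everything else in this sketch is illustrative, not any particular inverter's firmware.

```python
# Sketch of line-interactive inverter source selection: pass the line
# through while it is within limits, transfer to battery on a sustained
# brownout or surge. Thresholds are the ones given in the post.

LOW_V, HIGH_V = 87, 130

def source_for(line_volts):
    """Choose the supply source for a measured line voltage."""
    if LOW_V <= line_volts <= HIGH_V:
        return "line"     # condition/filter and pass through
    return "battery"      # brownout or surge: inverter takes over

print(source_for(110), source_for(80), source_for(135))  # line battery battery
```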

Just about everyone I know who isn't using an inverter as a filter has, at some time in recent years, lost hundreds in blown electronics - TVs, radios, PCs; even power-saver bulbs explode out of the sockets. So an inverter is not just for when you don't have power; it's good for when you do, too. What I did is loop everything through the inverter except the water heater - just take that out to the live side.
 

elkangorito

New member
Sep 24, 2007
My comments in red.


Mmm, yes, it will increase the bill, and I'll tell you why. Let's say you're running 10 lights at 100 watts each at 100 volts = 1000 watts.

Agreed.

Now next week you're only getting 50 volts, not 100, so now your power consumption is 500 watts, and you will only pay for 500 watts because that's what you used. Of course many things won't run this low, but it's for argument's sake; in practice, 90 volts instead of 110 is common.

Also agreed.

So now you go out and buy one of these regulators, and what it will do is boost the voltage from 90 to 110. Now your lights are brighter, but you are using a bit over 1000 watts of power, because the current drawn at 90 volts is increased to give you the power you asked for, plus the converter might need 50 watts for itself due to its efficiency. So the moral is: if you were previously under voltage, the converter will INCREASE your bill in proportion to how bad your supply was before.

Incorrect. The only "bill increase" will be the inefficiency of the regulator, which you stated as being a presumed 50 Watts. This maximum regulator "loss" will remain constant under all acceptable load conditions. 50 Watts for a 1 kW regulator is significant but 50 Watts for a 5 kW regulator is insignificant.

The best way to protect your precious electronics in DR is via an inverter. The good ones have line-conditioning circuits and filters but will switch into circuit on dips, i.e. < 87 volts, protecting your PC supply from brownouts. Likewise, surges over 130 volts will be suppressed, and if the condition lasts more than a few cycles, the inverter will take over your supply to deliver 110. It's ironic, but the inverter will turn on to battery supply during overvoltage to stabilize your house supply.

It's not ironic...it's doing what it was designed to do.

Just about everyone I know who isn't using an inverter as a filter has, at some time in recent years, lost hundreds in blown electronics - TVs, radios, PCs; even power-saver bulbs explode out of the sockets. So an inverter is not just for when you don't have power; it's good for when you do, too. What I did is loop everything through the inverter except the water heater - just take that out to the live side.

There is a huge difference between an inverter/battery arrangement & a voltage regulator arrangement, that being the cost of batteries.
Voltage regulators are not designed to deal with extreme & long-lasting supply discrepancies. Only an inverter/battery arrangement can truly deal with extreme loss of supply. This is all about choosing the right equipment for the right job. If the supply is intermittent, a combination inverter/battery setup is indicated. If loss of supply is not a concern but the quality of the available supply is, a voltage regulator is the cheapest & best way to deal with the problem.

You still haven't explained how a voltage regulator will increase your power bill except for mentioning a 50 Watt loss.