1 - 16 of 16 Posts

Watts equals volts X amps. A 1500 watt heater on 120v would pull 12.5 amps, while if it was run on 240v it would be 6.25. On something that small the difference is going to be so negligible that it would hardly be noticeable. But on a larger appliance that pulls much more amperage, it makes a big difference.

For instance, a stove on 240 volts usually uses a 60 amp breaker and #6 wire. If that same stove ran on 120 volts, it would require a 120 amp breaker and much heavier wire (roughly #1 rather than #6).

That's why the larger appliances like stoves, central heat/air, and water heaters always operate on 240.
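The arithmetic in the post above is easy to sketch in a few lines of code (the function name here is illustrative, not from the thread):

```python
# Amps drawn = watts / volts, so doubling the voltage halves the current.
def amps(watts, volts):
    return watts / volts

print(amps(1500, 120))  # 12.5 A for the heater on 120v
print(amps(1500, 240))  # 6.25 A on 240v
```

The same relation drives the stove example: doubling the current roughly doubles the breaker size and forces a much heavier conductor.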

I agree with Agmantoo. If a long run is what we're talking about, then I would go with 220V because of the lower current draw and lower voltage drop in the line. 120V will run at a higher current, causing more of a voltage drop in the line. Too much of a drop could cause a motor to self-destruct, but we don't have a motor in this case. It still could be a problem with a long run. It has nothing to do with efficiency. A watt's a watt. :cowboy:

As far as watts into the heater versus heat out, both are 100% efficient. In a resistance heater you get 3,413 BTU from one kilowatt-hour of electricity. The 240v heater will draw half the current of the 120v heater, but the watts are the same, so there will be no savings in kWh usage on your power bill.

However, the fact that the 240v heater draws fewer amps does have a couple of benefits, because it's the current in a wire that determines the voltage drop. This means: 1. You can sometimes use a smaller gauge wire for your heating circuit on 240v, and 2. The power dissipated by the wire (which you are paying for) will also be less.

So let's do some math to see what this means:

Let's suppose you have 1500 watts of baseboard heaters on one circuit 50 feet from the main panel. On 120v these heaters will draw 12.5 amps; on 240v they will draw 6.25 amps. You would need to run 14ga wire for either circuit, so you have no savings on the cost of wire. The voltage drop in the wire for the 120v heater is 3.2%, and for the 240v heater it is 0.8%. The voltage drop is essentially "lost" power that is dissipated in the wire and doesn't "make it" to the heater. If we assume the heater is on 50% of the time and you pay $0.10 per kWh, the 120v circuit will waste $1.73 per month while the 240v circuit will waste only $0.43. So you will save $1.30 per month running the heater on 240v.
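Those dollar figures can be reproduced with a short script. A sketch, assuming 14ga wire at about 3.1 ohms per 1000 ft (handbook values run roughly 2.5 to 3.1 depending on conductor temperature; this figure is an assumption, chosen to land within a few cents of the numbers above) and a 730-hour month:

```python
# Voltage drop and wasted-energy cost for a heater circuit.
# The 3.1 ohms/1000 ft resistance for 14ga is an assumed value.
OHMS_PER_1000FT_14GA = 3.1

def circuit_losses(watts, volts, one_way_ft, dollars_per_kwh=0.10, duty=0.5):
    amps = watts / volts
    r_wire = OHMS_PER_1000FT_14GA * (2 * one_way_ft) / 1000  # out and back
    drop_pct = amps * r_wire / volts * 100
    watts_lost = amps ** 2 * r_wire            # dissipated in the wire itself
    monthly_cost = watts_lost / 1000 * 730 * duty * dollars_per_kwh
    return drop_pct, monthly_cost

for v in (120, 240):
    pct, cost = circuit_losses(1500, v, 50)
    print(f"{v}v: {pct:.1f}% drop, ${cost:.2f}/month wasted in the wire")
```

This gives about 3.2% drop and $1.77/month on 120v versus 0.8% and $0.44/month on 240v, within a few cents of the $1.73 and $0.43 quoted above (the small gap comes from rounding and the assumed hours per month).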

Now if that same circuit were 100 feet, you'd need to run 12ga for the 120v circuit to keep the voltage drop under 5%, the maximum the NEC recommends (though 3% is suggested). The cost difference in this case would be a savings on installation of maybe $20-30.

So in general it's better to install an appliance at 240v unless you have a compelling reason to run 120v. Possible reasons might be that you want to be able to run the appliance on a 120v-only generator, or that you don't have space in your panel for a double-pole breaker for 240v but could fit a single-pole breaker for 120v.

Hope this helps, sorry the post was so long.

Cfabe, you forgot to mention that would be 6.25 amps on each leg of the 220v. 6.25 amps plus 6.25 amps equals 12.5 amps.

Still the same amperage and wattage regardless.

Nope. You only check the amperage on one leg. If you check the amperage on one leg of 110, you get 12.5. You don't add the amperage on the legs.

It IS the same wattage, but it is NOT the same amperage.

Watts equals volts X amps. If you divide any wattage by any voltage that gives you the amps drawn. If you double the voltage you cut the amps in half.

You got me on the 80%: you would have to run 12ga for 1500 watts on 120V if you were going by the code exactly. If it were run on 12ga, the drop would be 2% instead of 3.2%. Still a lot higher than the 240v option, even with the larger wire.

In my calculations there is a voltage drop on each leg for both voltages. In 120v the neutral carries just as much current as the hot, so it has voltage drop even though it's not a "hot". The % drop on the 240v is a quarter of that on the 120v: the amps are half, so the actual volts of drop are half as large, and that smaller drop is taken out of twice the voltage.
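That quarter-ratio claim is easy to check numerically. A sketch with an arbitrary illustrative round-trip wire resistance:

```python
# Percent voltage drop scales as 1/volts^2 for a fixed wattage and wire:
# half the amps across the same resistance is half the volts of drop,
# taken out of twice the supply voltage.
def pct_drop(watts, volts, r_wire):
    amps = watts / volts
    return amps * r_wire / volts * 100

r = 0.31  # ohms round trip, arbitrary illustrative value
ratio = pct_drop(1500, 120, r) / pct_drop(1500, 240, r)
print(round(ratio, 6))  # 4.0 -- the 120v circuit sees four times the % drop
```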

We have three phase to run our business, and although it is cheaper to operate, the electric company charges business rates for 3 phase, which are higher than home rates, so we don't see any savings.

P = I² × R, where P = power lost, I = current, and R = line resistance.

Since R is a constant with the same wire, current flow becomes the dominant factor in efficiency. Obviously, the higher current flow in the 110v system is much less efficient. In the example given above, the 120v system uses 12.5 amps; 12.5 squared equals 156.25. The current flow through the 240 volt system is 6.25 amps, and 6.25 squared equals 39.0625. Plug these numbers into the formula with the R value for the wire used, and you can see why the 240v system is more efficient.
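The I² × R comparison above, worked through: since R is the same wire either way, the ratio of line losses reduces to the ratio of the squared currents.

```python
# Same R in both cases, so compare the squared currents directly.
i_120, i_240 = 12.5, 6.25
print(i_120 ** 2)            # 156.25
print(i_240 ** 2)            # 39.0625
print((i_120 / i_240) ** 2)  # 4.0 -- the 120v line dissipates 4x the power
```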

Well, I'm "not" an electrical engineer, or an electrician.


I'm a homeowner doing new construction and looking for advice.

I have a question about the advice given, though.

See, I remember physics: this thing called the conservation of energy, where energy cannot be created or destroyed, only changed from one form to another.

I.e., kinetic energy to gravitational potential energy, gravitational potential energy to kinetic energy, electrical potential difference to heat/light/electromagnetically induced motion.

So my question is this, where is the energy "lost" to in this inefficient 120v system?

Could it be in the form of heat?

If the energy lost in heating appliance circuits is also in the form of heat, is that actually a loss in the efficiency of the circuit?

Is the conductor simply another heating element, and is higher current and lower voltage actually more beneficial for heating purposes since there is greater resistance thus more heat?

The conservation of energy theory you learned in high school holds true in most situations you are likely to encounter. The one place you will see it appear to fail is in a nuclear explosion, where a tiny bit of matter is converted to energy. At that point, of course, you don't care.


The electricity lost to resistance in the wire is converted to heat. Not a big deal when your intention is to convert all the electricity to heat anyway. The line loss does matter if the object you are powering up needs the electricity to run a motor or provide heat at a specific place, like a stove.

That is correct. In a 120 VAC circuit the current flows in and out of the LINE wire (depending on the phase angle of the AC signal), through the load, and to and from the NEUTRAL wire. In a 240 VAC circuit there are a pair of LINE wires (L1 and L2) that are 180 degrees out of phase with each other (they come off opposite ends of the transformer secondary winding). They are hooked up in series with the load, just like the LINE and NEUTRAL wires of a 120 VAC circuit. What flows out of L1 flows into L2, and vice versa. So it's the same current everywhere in the circuit.


As shown above, for a given amount of power to be delivered to the load, there will be half as much current in a 240 VAC circuit as in a 120 VAC circuit. Running twice the current (and dropping twice the voltage) through the wires in a 120 VAC circuit leaves less for the load. So you may have to use heavier wire, with lower resistance, to reduce the voltage drop in the wire if your load can't tolerate the lower voltage.
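One way to see the margin the higher voltage buys: compute the most round-trip wire resistance each circuit can tolerate before the drop passes 5%, the limit figure cited earlier in the thread. A sketch:

```python
# Maximum round-trip wire resistance before the voltage drop exceeds
# a given percentage. The 240 VAC circuit tolerates four times as much.
def max_wire_ohms(watts, volts, max_drop_pct=5.0):
    amps = watts / volts
    allowed_drop = volts * max_drop_pct / 100
    return allowed_drop / amps

print(max_wire_ohms(1500, 120))  # 0.48 ohms
print(max_wire_ohms(1500, 240))  # 1.92 ohms
```

Four times the allowable resistance means the 240 VAC circuit can use thinner wire, or run much farther, before the load sees an out-of-spec voltage.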


Homesteading Forum

A forum community dedicated to living sustainably and self sufficiently.