efficiency question re: baseboard heater: 220 or 110?

Someone told me that a 220 baseboard heater was more efficient. I am not an electrician nor an engineer, but it would seem that a kilowatt of heat does the same job regardless of voltage.

You are correct. If an appliance is rated 2000 watts, 110v/220v, it can be hooked up to either 110 volts or 220 volts. It will draw 2000 watts either way, which means you get charged the same by the electric company. No saving in cost.

If you have a long wire run, 220 volts will get power to the point of use more efficiently. You can also use a smaller wire (a higher gauge number) with a 220 volt source, along with lower-amp breakers, which are typically cheaper.

Watts equals volts times amps. A 1500 watt heater on 120v would pull 12.5 amps, while on 240v it would pull 6.25. On something that small the difference is so negligible that it would hardly be noticeable, but on a larger appliance that pulls much more amperage, it makes a big difference.

For instance, a stove on 240 volts usually uses a 60 amp breaker and #6 wire. If that same stove ran on 120 volts, it would require a 120 amp breaker and #3 wire.

That's why larger appliances like stoves, central heat/air, and water heaters always operate on 240.
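The arithmetic above (watts = volts x amps) can be checked with a short Python sketch; the wattages and voltages are just the figures quoted in this thread:

```python
# Current draw from the power equation P = V * I, so I = P / V.

def amps(watts, volts):
    """Current drawn by a resistive load of the given wattage."""
    return watts / volts

# The 1500 W heater from the example above:
print(amps(1500, 120))  # 12.5 A on a 120 V circuit
print(amps(1500, 240))  # 6.25 A on a 240 V circuit
```

Same wattage either way; only the current changes.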

I agree with Agmantoo. If a long run is what we're talking about, then I would go with 220V because of the lower current draw and lower voltage drop in the line. 120V will run at a higher current, causing more voltage drop in the line. Too much of a drop could cause a motor to self-destruct, but we don't have a motor in this case. It could still be a problem with a long run, though. It has nothing to do with efficiency. A watt's a watt.

__________________

"Knowledge didn't hatch out on a flat rock." Clayton Peary

Efficiency for single phase current will be the same, regardless of whether it's 110v or 220v. There is, however, an efficiency advantage with three phase current over single phase current, but three phase current is not normally available to residential customers.

Well I *am* an electrical engineer so here is the deal:

As far as watts into the heater versus heat out, both are 100% efficient. In a resistance heater you get 3,413 BTU from one kilowatt-hour of electricity. The 240v heater will draw half the current of the 120v heater, but the watts are the same, so there will be no savings in kwh usage on your power bill.
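That conversion is just a change of units, so a one-liner shows that the heat delivered depends only on the energy used, not the supply voltage (using the 3,413 BTU/kWh figure from the post; the exact value is about 3,412.14):

```python
# Heat output of a resistance heater depends only on energy consumed,
# not on whether it is wired for 120 V or 240 V.

BTU_PER_KWH = 3413  # figure quoted above; exact value is ~3412.14

def btu_output(watts, hours):
    """Heat delivered by a resistance heater, independent of supply voltage."""
    return watts * hours / 1000.0 * BTU_PER_KWH

# A 1500 W heater running for one hour -- same answer at 120 V or 240 V:
print(btu_output(1500, 1))  # 5119.5 BTU
```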

However, the fact that the 240v heater draws fewer amps does have a couple of benefits, because it's the current in a wire that determines the voltage drop. This means: 1. You can sometimes use a smaller gauge wire for your heating circuit on 240v, and 2. The power dissipated by the wire (which you are paying for) will also be less.

So let's do some math to see what this means:

Let's suppose you have 1500 watts of baseboard heaters on one circuit 50 feet from the main panel. On 120v these heaters will draw 12.5 amps; on 240v they will draw 6.25 amps. You would need to run 14ga wire for either circuit, so you have no savings on the cost of wire. The voltage drop in the wire for the 120v heater is 3.2%, and for the 240v heater it is 0.8%. The voltage drop is essentially "lost" power that is dissipated in the wire and doesn't "make it" to the heater. If we assume the heater is on 50% of the time and you pay $0.10 per kwh, the 120v circuit will waste $1.73 per month while the 240v circuit will waste only $0.43. So you will save $1.30 per month running the heater on 240v.

Now if that same circuit were 100 feet, you'd need to run 12ga for the 120v circuit to keep the voltage drop under 5%, the limit the NEC recommends (3% is suggested for branch circuits). The cost difference in this case would be a savings on installation of maybe $20-30.

So in general it's better to install an appliance at 240v unless you have a compelling reason to run 120v. Possible reasons might be if you want to be able to run the appliance on a 120v only generator. Or if you don't have space in your panel for a double pole breaker for 240v, but could fit a single pole breaker for 120v.
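The estimate above can be reproduced with a rough Python sketch. The wire resistance (about 2.5 ohms per 1000 ft for 14 ga copper), the 50% duty cycle, $0.10/kWh rate, and a 730-hour month are assumptions matching the post; the poster evidently used a slightly higher resistance figure, so the dollar amounts here come out a little lower, but the 4-to-1 ratio between the two voltages holds regardless:

```python
# Rough voltage-drop and wasted-power estimate for a resistive heater,
# using I = P / V and P_loss = I^2 * R over the round-trip wire run.

OHMS_PER_1000FT_14GA = 2.525  # approximate resistance of 14 ga copper

def circuit_loss(watts, volts, one_way_ft, ohms_per_1000ft=OHMS_PER_1000FT_14GA):
    amps = watts / volts
    r_wire = ohms_per_1000ft * (2 * one_way_ft) / 1000.0  # out and back
    drop_v = amps * r_wire
    loss_w = amps ** 2 * r_wire
    return drop_v / volts * 100, loss_w  # (% drop, watts wasted)

def monthly_cost(loss_w, duty=0.5, rate=0.10, hours=730):
    """Dollars per month dissipated in the wiring."""
    return loss_w * duty * hours / 1000.0 * rate

for volts in (120, 240):
    pct, loss = circuit_loss(1500, volts, 50)
    print(f"{volts} V: {pct:.1f}% drop, ${monthly_cost(loss):.2f}/month wasted")
```

Because % drop scales as P x R / V^2, doubling the voltage cuts both the percentage drop and the wasted watts by a factor of exactly four for the same wire.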

Cfabe, you forgot to mention that it would be 6.25 amps on each leg of the 220v. 6.25 amps plus 6.25 amps equals 12.5 amps, and 0.8 percent voltage drop on each leg would total 1.6 percent. Still the same amperage and wattage regardless. Also, 1500 watts would exceed the 80 percent rating of a 14 gauge circuit, so you would have to step up to 12 gauge anyway, meaning less voltage drop on the 110 volt circuit if mldollins went that route.

You got me on the 80%; you would have to run 12ga for 1500 watts on 120V if you were going by the code exactly. Run on 12ga, the drop would be 2% instead of 3.2%, still a lot higher than the 240v option even with the larger wire.

In my calculations there is a "voltage drop on each leg" for both voltages. On 120v the neutral carries just as much current as the hot, so it has voltage drop even though it's not a "hot". The % drop on the 240v is a quarter of that on the 120v: the amps are half, so the actual volts of drop are half as large, and that smaller drop is taken as a percentage of twice the voltage.

We have three phase to run our business, and although it's cheaper to operate, the electric company charges business rates for 3 phase, which are higher than home rates, so we don't see any savings.

To calculate the power lost in transmission, use this formula (written long hand due to keyboard limitations):

P = I x I x R, where P = power lost, I = current, and R = line resistance.

Since R is a constant with the same wire, current flow becomes the dominant factor in efficiency. Obviously, the higher current flow in the 110v system is much less efficient. In the example given above, the 120v system uses 12.5 amps, and 12.5 squared equals 156.25. The current flow through the 240 volt system is 6.25 amps, and 6.25 squared equals 39.0625. Plug these numbers into the formula with the R value for the wire used, and you can see why the 240v system is more efficient.
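That comparison of squared currents can be written out directly; since R is the same for both circuits, it can be left at a placeholder value of 1 ohm because only the ratio matters:

```python
# Transmission loss P = I^2 * R: with the same wire (same R), loss scales
# with the square of the current.

R = 1.0  # ohms; any value works, since we only compare the two cases

i_120 = 1500 / 120   # 12.5 A
i_240 = 1500 / 240   # 6.25 A

loss_120 = i_120 ** 2 * R   # 156.25 * R
loss_240 = i_240 ** 2 * R   # 39.0625 * R

print(loss_120 / loss_240)  # 4.0 -- the 120 V run wastes four times the power
```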

__________________

Only the paranoid survive.

I carry a gun because a cop is too heavy.

"Since I'm in charge, obviously, we screwed up." -Barack Obama