Anonymous asked in Science & Mathematics › Engineering · 1 decade ago

Does increasing the thickness of electrical wiring in a home lower energy, meter readings and bills?

I heard someone at an energy conference refer to a study that showed that going to a one size thicker wire in a home -- I am assuming on the American wire gauge, AWG, system -- will lead to a significant drop in home electrical energy usage. I write "significant" because the percentage was so big I can't remember it exactly.


10 Answers

  • 1 decade ago
    Best Answer

    With the operative word being "significant", the answer for residential applications is "NO".

    There are some losses in the wiring in your home, but based on residential loads, this value is rather minimal.

    All practical conductors have some resistance to them. Ohm's Law tells us that if we pass a current through a "resistor", there is a voltage drop. E = I x R. For example, you might have 120 volts at the service entrance panel, but on the other side of the house the voltage might drop to 117 V when a large appliance is running.

    Some people refer to these as the I²R losses. Current squared times the conductor resistance.
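    As a quick sketch of that arithmetic (the 120 V service and 117 V figures are from the example above; the 15 A load and 0.2 Ω of round-trip wire resistance are assumed for illustration, not taken from any study):

```python
# Ohm's law voltage drop and I^2*R wiring loss.
# The 15 A load and 0.2 ohm wire resistance are assumed example values.

def voltage_at_load(source_v, current_a, wire_r_ohm):
    """Voltage left at the appliance after the drop E = I * R."""
    return source_v - current_a * wire_r_ohm

def line_loss_w(current_a, wire_r_ohm):
    """Power dissipated in the wiring itself: P = I^2 * R."""
    return current_a ** 2 * wire_r_ohm

print(voltage_at_load(120, 15, 0.2))  # 117.0 V at the far side of the house
print(line_loss_w(15, 0.2))           # 45.0 W lost as heat in the wire
```

    Note that the loss grows with the square of the current, which is why the effect only matters on circuits that carry a heavy load for long stretches of time.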

    I recall an article several years ago where they did a case study with the lighting circuits in a commercial building. Since non-residential electric rates include a demand charge, the article described how increasing the wiring by one size could lower the customer's kW demand charge along with lowering the kWh losses associated with the voltage drop in the line. In the example, the customer had dedicated lighting circuits with significant load that was on for 8000 hours a year. That's over 90% of the year!

    See the link below for similar examples and calculations. Sorry, but the link from the Copper Development Association (CDA) is not working. I've included the link below in the event it begins working in the future.

    --------- "Additional Details" comments

    A 100 watt lightbulb does not always consume 100 watts. In reality the bulb's filament has a specific resistance. If the incoming voltage is lower than anticipated, the bulb's resulting consumption will be lower than 100 watts.

    To further confuse things... A motor is a constant power device. When the voltage drops, your Central AC motor is going to draw more current to maintain constant power. The increased current increases the I²R losses described above.
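    To put a rough number on that effect (the 2400 W motor, 240 V circuit, and 0.2 Ω of wire resistance below are assumed values, not from any measurement):

```python
# A constant-power motor draws more current as the voltage sags,
# which raises the I^2*R loss in the wiring. Illustrative numbers only.

def motor_current_a(power_w, voltage_v):
    # Constant-power approximation: I = P / V (power factor ignored)
    return power_w / voltage_v

def wiring_loss_w(power_w, voltage_v, wire_r_ohm):
    i = motor_current_a(power_w, voltage_v)
    return i ** 2 * wire_r_ohm

print(wiring_loss_w(2400, 240, 0.2))            # 20.0 W at nominal voltage
print(round(wiring_loss_w(2400, 230, 0.2), 1))  # 21.8 W when the line sags
```

    So a sagging line makes a constant-power load slightly *more* wasteful in the wiring, the opposite of the resistive lightbulb case above.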

    All in all, the short duration usage patterns of residential customers make the savings VERY small. Replacing your wiring would not provide a reasonable payback. Upsizing during new construction may ultimately pay off on some circuits in the home, but be a waste of money on others.

  • 1 decade ago

    Theoretically yes: a thicker wire gives lower I²R losses (watts), as the resistance R is lower in wire of greater diameter. HOWEVER, the difference in resistance between copper wires just a couple of gauges apart is so small as to make no difference in the meter reading or the monthly bill.

    Consider that a 12 gauge wire has a resistance of roughly 8 ohms per mile while an 8 gauge wire's is roughly 3 ohms per mile, the difference being 5 ohms per mile. Say you had 1 mile of wire in your house (which is impossible) with 10 A running constantly. The wire heating loss would be 10² x 5 = 500 W, or 0.5 kW. Running that for 1 hour would consume 0.5 kWh; at say 10 cents/kWh, that's 5 cents per hour.

    Now, there would be no more than about 100 feet of loop length on a branch circuit, so the cost would be 100/5280 x 5 cents ≈ 0.09 cents per hour.
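    Re-running that arithmetic (using the answer's round resistance figures and its assumed 10 A constant load and 10 cents/kWh rate):

```python
# Scale the 1-mile thought experiment down to a 100 ft branch circuit.
# Resistance difference and rate are the answer's assumed round numbers.

r_diff_ohm_per_mile = 8 - 3   # 12 gauge vs 8 gauge, ohms per mile
current_a = 10                # assumed constant load
rate_per_kwh = 0.10           # dollars per kWh

loss_mile_w = current_a ** 2 * r_diff_ohm_per_mile   # 500 W for a full mile
loss_100ft_w = loss_mile_w * 100 / 5280              # ~9.5 W for 100 ft
cost_cents_per_hour = loss_100ft_w / 1000 * rate_per_kwh * 100

print(round(loss_100ft_w, 1))         # 9.5 W of avoidable heating
print(round(cost_cents_per_hour, 2))  # 0.09 cents saved per hour
```

    Even running around the clock, that is well under a dollar a year per circuit.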

    Conclusion: Your energy conference leader is talking RUBBISH.

    Source(s): Electrical Engineer with 40+ years experience
  • 4 years ago

    As already noted, these are available. They are sometimes known as Meter Socket Blank Covers. Most are plastic these days. Old ones were glass. The meter socket is a standardized dimension. No need to measure. Very unlikely to be found at big box stores. The power company may not be a bad idea (may even get it for free). The other likely place is going to be at an electrical supply store (Some names that may be in your area are: Platt, All-Phase, Consolidated Electrical Distributors/CED, Stusser, Graybar). Of course, the suggestion to remove it and repair the hole was not bad either. I have also seen several other answers from Mr. 50 yrs retired electrician. I'm not sure where he did his 50 yrs. Not likely anywhere that the NEC was enforced. If I'm wrong, I should move and start finding his old customers. I could spend the rest of my career fixing work like that.

  • 1 decade ago

    Depends on how many amps you are using in the first place, I assume.

    If you underestimated the amount of current a particular circuit would carry and undersized the wire as a result, then I suppose the smaller wire will have more apparent resistance, since it can't handle the current load without overheating.

    They seem to be suggesting that the standard gauge wire used in American homes is undersized, i.e., that our standards are off and the electrical code is wrong.

  • Anonymous
    1 decade ago

    In the case of a light bulb, bigger wire will let the same bulb draw more power. The voltage drop along smaller wire is greater, so the voltage at the bulb is lower. Since power = V²/R, a bulb fed through smaller wire draws less power and burns dimmer.
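    A small sketch of that P = V²/R point (treating the filament resistance as constant, which it only roughly is; all numbers are illustrative):

```python
# A resistive bulb's power falls with the square of the voltage it sees.
# Filament resistance is approximated as constant for illustration.

rated_w, rated_v = 100, 120
filament_r = rated_v ** 2 / rated_w      # R = V^2 / P = 144 ohm

def bulb_power_w(voltage_v):
    return voltage_v ** 2 / filament_r   # P = V^2 / R

print(bulb_power_w(120))            # 100.0 W at full line voltage
print(round(bulb_power_w(117), 1))  # 95.1 W after a 3 V drop in the wiring
```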

    With a heavily loaded motor the results would be different due to the effect of reduced voltage on the speed torque curve.

    I think the people who suggest that this would save energy are in the business of selling copper!

  • Anonymous
    1 decade ago

    I will tell you that it does not matter what you use to get the electricity somewhere. If you turn a light on, it doesn't matter what wire the power goes through; the light still needs the same amount of electricity. I work in this business and I have seen contractors wiring houses in 14 gauge wire. In my opinion, this is a bad idea; houses should be wired with 12-2 Romex. If you use 14, you'll just have to make shorter runs, which means more breakers and bigger panels than are really necessary.

  • Irv S
    Lv 7
    1 decade ago

    Since you're talking about the U.S., the trend here is to use a lot of #14 AWG (15 A) circuits. There is somewhat greater heat loss in such circuits in normal use compared to #12 wire.

    Most circuits, (receptacles) "stand & wait" most of the time, so going to #12 wire would not be economically sound.

    For larger constant loads, (refrigerators etc.), a small savings in electrical energy might be realised, but it would only be applicable in summer as the heat produced in winter goes toward heating the house.

    For A.C., refrigerator, and air-handler circuits, the larger conductors might make sense as energy costs increase, but not for all wiring.

    You'd get a lot more "bang for the buck" by swapping that 100W. lightbulb for a 27W. fluorescent.

  • Anonymous
    1 decade ago

    No, you're still going to use the same amount of electricity. It's kind of like water: if you were going to fill a 5 gallon bucket, it still takes 5 gallons whether you use a straw or a firehose.

  • ?
    Lv 7
    1 decade ago

    Erm, no, it won't.

    The resistance of the wire is negligible.

  • Anonymous
    1 decade ago

    Current through thinner wire creates heat, which raises its resistance, so it is possible...
