How much electricity does a 40w light bulb use per hour?
Also how much do other common wattages use?
No kidding, Omer. I meant kWh. Something easy to estimate the cost with.
- 1 decade ago, Favorite Answer
It was not Omer’s fault your question was not that clear.
By the way, the real answer is ZERO, because my 40-watt bulb is not turned on.
Maybe the question should have read:
If I have a 40w light bulb, in a lamp that is turned on, how many kilowatt-hours of electricity does it use?
- Joan, Lv 4, 4 years ago
Fluorescent bulbs (compact or otherwise) give off more light per watt. The TRUE wattage is the amount of power the bulb consumes. Many CF bulbs, however, are marketed with the approximate wattage of an equivalent-brightness incandescent bulb (you'll find the real wattage in the fine print). A 4-foot tube is typically between 15 and 40 watts, so depending on its rating it could be more or less than a 40-watt incandescent bulb. A "replaces 40W" CF bulb really only consumes around 11W.
- John M, Lv 4, 1 decade ago
Electrical power is measured in watts; energy consumption over time is measured in watt-hours.
Voltage is the equivalent of the pressure of the electricity (110 volts, 220 volts, etc.).
Amperage (amps) is the current of the electricity.
The power company bills you for the energy you consume, in kilowatt-hours. A kilowatt-hour is 1000 watts sustained for one hour.
So a 40-watt light bulb (if rated at 110 volts) uses 40 watt-hours for each hour it is on. If the 40-watt bulb were rated at 220 volts, it would still consume 40 watts, but it wouldn't draw the same amperage.
Now, if you are trying to figure out the amperage the 40-watt bulb will draw, there is a simple formula to use: the P.I.E. formula, P = I × E, where P = wattage, I = amperage, and E = voltage.
To figure out the amperage of your 40 watt bulb take the wattage value (40) and divide it by the Voltage (110 volts for example) and you'll get .364 (rounded) amps. So on a 15 amp circuit you could put about 41 lightbulbs before you pop the breaker.
If you want to figure out your wattage and you know the amperage draw and the voltage, multiply the voltage by the amperage and that gives you the wattage.
Finally, if you know the amperage draw and the wattage, divide the wattage by the amperage and that gives you the voltage.
This is a lot of info for such a simple question....but now you know.
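Those three rearrangements of P = I × E can be sketched in a few lines of Python (a rough sketch; the 40 W bulb, 110 V supply, and 15 A breaker are just the numbers from the answer above, not fixed constants):

```python
# P = I * E and its rearrangements, as described above.

def amps(watts, volts):
    """Current drawn: I = P / E."""
    return watts / volts

def power(current, volts):
    """Power consumed: P = I * E."""
    return current * volts

def voltage(watts, current):
    """Voltage: E = P / I."""
    return watts / current

bulb_amps = amps(40, 110)               # about 0.364 A for a 40 W bulb on 110 V
bulbs_on_breaker = int(15 / bulb_amps)  # about 41 bulbs before a 15 A breaker pops
```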
- John himself, Lv 6, 1 decade ago
watts times hours = watt hours. Watt hours divided by 1000 = kilowatt hours.
So 40 watts for 1 hour is 40 watt-hours. Burning for 1000 hours would be 40 kWh.
Substitute any other wattage you wish.
My power costs 8 cents per kWh. So that 40-watt bulb costs 0.32 cents per hour (0.04 kWh × 8 cents).
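That cost arithmetic boils down to one line; here is a minimal sketch, using the 8-cents-per-kWh rate above as an example (your rate will differ):

```python
def cost_cents_per_hour(watts, cents_per_kwh):
    # watts / 1000 = kWh consumed per hour of burning
    return watts / 1000 * cents_per_kwh

hourly = cost_cents_per_hour(40, 8)  # about 0.32 cents per hour for a 40 W bulb
```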
Some of the other answers are a hoot.
Source(s): I'm an electrician.
- 4 years ago
A 40-watt bulb consumes 40 watt-hours of energy for each hour it runs.
- Max Schnell, Lv 6, 1 decade ago
Ask what time it is and you get a discourse on how clocks work :-)
All you have to do is divide the wattage by 1000 to get the kWh used per hour.
Multiply that by your rate per kWh and you'll get the cost.
- Anonymous, 1 decade ago
If you are asking how much you would pay, it depends on how much you are charged per kWh (kilowatt-hour). A 100-watt bulb, for example, uses 0.1 kWh per hour, so if the rate were $10 per kWh, you would be paying $1 per hour for your light bulb.
Ask your electrician about the ratings of all your appliances.
If you are looking for the current, use P = I × V, where V is the voltage (110 or 220 V, depending on the place).
- Yawn Gnome, Lv 7, 1 decade ago
A .gov search turns up some good data.
- 6 years ago
don't use power... it costs a **** load of money
- 1 decade ago
Jason handled this one for you.