# Which is brighter: a 60W bulb dimmed to 25W or a 25W bulb at full brightness? Explain with math if you can.

Assume that both bulbs are similar in every other way. I'm looking for maximum brightness on a limited budget and power supply. Incandescent is the only option.

Let's be clear here: luminosity, "lumen output," "light output," or whatever you want to call the light intensity of a bulb is not a direct function of filament temperature alone. Claiming otherwise is simply wrong.

Incandescents are rated by the power they draw and nothing else. The color temperature is a function of the filament and the power applied. Most filaments are tungsten, so the filament can never heat above about 3,700 Kelvin, the melting point of tungsten.

So, most filaments are limited to 2,000 to 3,300 Kelvin. This is the resulting color temperature and why some lights are whiter and some are more yellow, the latter tending to last longer since they run cooler ("warm" light).

In other words, a 100 watt bulb and a 40 watt bulb operate at roughly the SAME filament temperature if they are the same type of bulb, but the 100 watt bulb produces more light. The "light output" is not a function of filament temperature alone, as veeyesvee and Tony RB asserted.

So, a 60 watt bulb, Tony, had better produce more lumens than a 25 watt bulb, or what's the point of the other 35 watts? Can someone design a really inefficient transducer? Sure... a toaster makes a terrible light source considering the power it consumes.

The real answer lies in the fact that the filament is not a linear resistance (see below).

The hotter the filament gets, the more efficient it becomes at producing light: a 60 watt bulb manages about 14.2 lumens per watt, where a 25 watt bulb manages only about 8 lumens per watt.

The amount of light produced by a bulb depends on its rated values versus what is actually applied. Since all you can vary is the voltage:

Luminosity scales as the voltage ratio to the 3.5 power: L2/L1 = (V2/V1)^3.5.

Life scales as the voltage ratio to the -12th power: Life2/Life1 = (V2/V1)^-12.

Current scales as the voltage ratio to the 0.55 power: I2/I1 = (V2/V1)^0.55.

(V2/V1)^3.5 = 200 lm / 850 lm = 0.235

V2/V1 = 0.235^(1/3.5) = 0.661 ==> V2 = 120 × 0.661 = 79.34 volts

I2/I1 = (V2/V1)^0.55 = 0.661^0.55, with I1 = 60 W / 120 V = 0.5 A ==> I2 = 0.398 A

P2 = 79.34 V × 0.398 A = 31.6 W

OR: power ratio = (V2/V1) × (V2/V1)^0.55 = (V2/V1)^1.55 = 0.52, and 60 W × 0.52 = 31.6 W

So the 60 watt bulb needs about 32 watts of power just to generate 200 lumens. Dimmed all the way down to 25 watts, it produces even less, so the 25 watt bulb at full power wins.
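As a sanity check, the arithmetic above can be reproduced in a few lines of Python. The 850 lm and 200 lm figures are the lumen ratings quoted in this answer (14.2 lm/W and 8 lm/W respectively), and 120 V is the assumed rated line voltage, not measured values:

```python
# Scaling laws used above: lumens ~ (V2/V1)^3.5, current ~ (V2/V1)^0.55
V1 = 120.0                    # rated voltage, volts
P1 = 60.0                     # rated power of the bigger bulb, watts
target_ratio = 200.0 / 850.0  # lumen ratio we want (~0.235)

# Solve (V2/V1)^3.5 = 0.235 for the voltage ratio:
v_ratio = target_ratio ** (1 / 3.5)   # ~0.661
V2 = V1 * v_ratio                     # ~79.3 V

# Current at rated voltage, then scaled by (V2/V1)^0.55:
I1 = P1 / V1                          # 0.5 A
I2 = I1 * v_ratio ** 0.55             # ~0.398 A

P2 = V2 * I2                          # ~31.6 W
print(f"V2 = {V2:.1f} V, I2 = {I2:.3f} A, P2 = {P2:.1f} W")
```

Running it gives roughly 79.3 V, 0.398 A, and 31.6 W, matching the hand calculation.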

Source(s): 30 years engineering
• When incandescent bulbs are rated "60 W" or "25 W," that is the power drawn by the bulb at its full rated voltage.

It does not describe the lumen output of the bulb.

Lumen output is determined by the temperature of the filament.

If you run the experiment, you will see that when the power going through a 60 W bulb is the same as the power going through a 25 W bulb, the 60 W bulb will be dimmer than the 25 W bulb.

Why is that ?

See the answer provided by "veeyesvee".

veeyesvee - no, the temperature is not 91,800 degrees; it's around 3,200 K.

However

"I'm looking for maximum brightness on a limited budget and power supply. Incandescent is the only option."

Maximum brightness on a limited... power supply implies high efficiency. Your best option for this in an incandescent lamp is a 20 W halogen bulb, ideally a xenon-filled halogen bulb.

But why choose incandescent?

The best 20 W incandescent bulb (as above) will give about 25 lumens per watt, i.e. 500 lumens.

High-intensity discharge lamps give about 80 lumens per watt: at 20 W, that's 1,600 lumens.

And high-power LEDs such as the Luminus SST-90 give around 120 lumens per watt: at 20 W, that's 2,400 lumens.
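The comparison above is just efficacy times available power. A minimal sketch, using the lumens-per-watt figures quoted in this answer (real products vary):

```python
# Lumens = efficacy (lm/W) x power (W), for a fixed 20 W budget
power_w = 20
options = {
    "xenon halogen incandescent": 25,       # lm/W, as quoted above
    "high-intensity discharge":   80,
    "high-power LED (SST-90)":    120,
}

results = {name: eff * power_w for name, eff in options.items()}
for name, lumens in results.items():
    print(f"{name}: {lumens} lumens at {power_w} W")
```

This makes the asker's trade-off concrete: within the same power budget, the LED option delivers almost five times the light of the best incandescent.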

• The light output of a bulb depends on the temperature that the filament reaches, which in turn depends on the power applied to it. The 60 watt bulb is rated for 60 watts, and the filament will reach the expected temperature (91800 deg??) when 60 watts are applied. With 25 W applied to a 60 watt bulb, the filament will not reach this temperature and will produce just a red glow, if any light at all. Obviously the 25 watt bulb with 25 watts applied will be brighter.