On 13 May 2010 at 11:11, les at frii.com wrote:

> If the 100w bulb is not brighter than the 40w bulb, why would they
> make 100w bulbs? Just to waste energy? I realize that watts is not a
> measure of brightness, but for any given technology, the higher
> wattage bulb is brighter.

>> But most people will, without thinking, say "Oh, a 100W bulb is
>> brighter than a 40W, so the answer is obvious." And wrong.
I'll try to explain without going mathematical on you.
In this case, the answer requires a little thought. The current
through a bulb is proportional to the voltage across it and inversely
proportional to the bulb's resistance (Ohm's law). And the power
dissipated is the product of the current through the bulb and the
voltage across it.
In a normal application, the voltage is more-or-less a fixed
quantity, so that the power dissipated by the bulb is basically
inversely proportional to the resistance. Lower resistance = higher
power. So a 100W lamp has a lower resistance than a 40W bulb and
dissipates more power at the same voltage.
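Here's a quick numeric sketch of that, assuming a 120 V supply (the
voltage is my assumption, not stated above) and treating each filament's
hot resistance as constant, which is a simplification:

```python
# Normal (parallel/mains) case: fixed voltage, so R = V^2 / P
# follows from P = V * I and I = V / R.
V = 120.0  # assumed supply voltage

def resistance(rated_watts, volts=V):
    """Hot resistance implied by a bulb's power rating at `volts`."""
    return volts ** 2 / rated_watts

r40 = resistance(40)    # 360 ohms
r100 = resistance(100)  # 144 ohms

# The 100 W bulb has the LOWER resistance, so at the same
# voltage it draws more current and dissipates more power.
print(r40, r100)
```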
But put them in series. There, the voltage is divided across the
lamps in the same ratio as their resistances, so the 40W lamp, having
the higher resistance, also has a higher voltage across it. Since
the current flowing through both lamps is the same (they're in
series), the power dissipated by the 40W lamp and that of the 100W
lamp will be in the same ratio as the lamps' respective resistances.
So the 40W lamp, with its higher resistance, dissipates more power
and glows brighter than the 100W lamp.
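Continuing the same sketch (again assuming a 120 V supply and constant
hot-filament resistances of 360 and 144 ohms from the ratings above),
the series case works out like this:

```python
# Series case: one current through both bulbs, so P = I^2 * R
# favors the bulb with the LARGER resistance -- the 40 W one.
V = 120.0                  # assumed supply voltage
r40, r100 = 360.0, 144.0   # hot resistances implied by the ratings

i = V / (r40 + r100)       # single loop current, ~0.238 A
p40 = i ** 2 * r40         # ~20.4 W dissipated in the "40 W" bulb
p100 = i ** 2 * r100       # ~8.2 W dissipated in the "100 W" bulb

# The power ratio equals the resistance ratio (360/144 = 2.5).
print(round(p40, 1), round(p100, 1))
```

Real filaments change resistance a lot between cold and operating
temperature, so the exact numbers shift, but the 40W-wins conclusion
holds either way.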
--Chuck