> If the 100W bulb is not brighter than the 40W bulb, why would they make
> 100W bulbs? Just to waste energy? I realize that watts is not a measure
> of brightness...
If you run a 100W bulb at its rated voltage, it is going to be brighter
than a similar-technology 40W bulb also run at its rated voltage. I don't
think anyone is disputing that.
And if you connect 2 bulbs of the same voltage in parallel (say 2
mains-voltage bulbs) and connect them to a source of that voltage, then
the 100W one will be brighter than the 40W one. Again, we all agree on that.
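As a quick sanity check, here's a minimal sketch in Python (assuming 230V
mains purely for concreteness; any supply voltage gives the same result).
The hot resistance implied by each rating is R = V^2/P, and in parallel
each bulb sees the full supply voltage, so each dissipates its rated power:

V = 230.0                      # assumed mains voltage
for rating in (40.0, 100.0):
    R = V**2 / rating          # hot resistance implied by the rating
    P = V**2 / R               # in parallel, each bulb sees the full V
    print(f"{rating:.0f}W bulb: R ~ {R:.0f} ohms, dissipates {P:.0f}W")

Note that the lower-rated bulb has the _higher_ resistance; that
inversion is exactly what matters in the series case below.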
But if you connect the bulbs _in series_, you will find that the 40W one
is brighter. It's not dissipating 40W, and the 100W one is not dissipating
anything like 100W.
If you assume the bulbs are constant resistances (which is incorrect, but
see later), then since they are passing the same current (they are in
series), the power dissipated, P = I^2 * R, is proportional to the
resistance, and a 40W bulb has a higher resistance than a 100W bulb. In
fact, bulbs have a positive temperature coefficient of resistance, which
means that when you connect them in series like this, the 40W bulb heats
up, its resistance increases so it dissipates even more power, while the
100W bulb stays cool, its resistance remains low, and it dissipates very
little power.
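To put numbers on the constant-resistance approximation, here's a minimal
sketch (again assuming 230V mains; the power ratio comes out the same at
any voltage). Even before the temperature coefficient makes things worse
for the 100W bulb, the 40W bulb already gets well over twice the power:

V = 230.0                     # assumed mains voltage
R40 = V**2 / 40.0             # ~1322 ohms (implied by the 40W rating)
R100 = V**2 / 100.0           # ~529 ohms (implied by the 100W rating)
I = V / (R40 + R100)          # same current through both bulbs (series)
P40 = I**2 * R40              # ~20.4W in the 40W bulb
P100 = I**2 * R100            # ~8.2W in the 100W bulb
print(f"I ~ {I*1000:.0f}mA: 40W bulb gets {P40:.1f}W, "
      f"100W bulb gets {P100:.1f}W")

In reality the split is even more lopsided than this: the 40W filament
runs hot (resistance up, power up) while the 100W filament stays cool
(resistance below its rated figure, power down), as described above.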
Try it....
-tony