For colour proofing, especially in video and photography, people still
rely on CRTs (I know I do).
No surprise there. I wouldn't maintain that flat panels are better in
*all* respects!
But for people who don't need that kind of detailed colour fidelity
(which includes me and most people at $DAYJOB, for example), that issue
slides down the priority list, often well below other things.
CRTs are non-linear at the low end of the gamma curve, yielding darker
and more even blacks and dark greys, while LCDs use a linear gamma that
produces a lot of artifacting at the low end.
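The shape of that difference can be put in numbers. A minimal sketch
(the gamma exponent is an assumption; typical CRT figures are quoted as
2.2 to 2.5, and real tubes vary) comparing how a power-law response and
a linear response map the darkest input levels:

```python
CRT_GAMMA = 2.5  # assumed CRT exponent; not a measured value

def crt_luminance(level, gamma=CRT_GAMMA):
    """Power-law response: dark inputs are crushed toward black."""
    return (level / 255.0) ** gamma

def linear_luminance(level):
    """Idealised linear response: luminance tracks the input directly."""
    return level / 255.0

for level in (1, 8, 16, 32):
    print(f"input {level:3d}: CRT {crt_luminance(level):.5f}  "
          f"linear {linear_luminance(level):.5f}")
```

At input 16 the power-law display emits roughly 0.001 of full
luminance versus 0.063 for the linear one, which is where the "darker
blacks" claim comes from.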
Artifacting? I just tried a grey wash (black at one end to white at
the other), and the only things I see that could be called artifacts
are, I think, actually Mach bands. (I don't have a good light meter
available to actually measure the emitted light.)
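For anyone who wants to repeat the experiment, a grey wash is easy to
generate. A sketch that writes one as a plain-text PGM file (the
filename and dimensions are arbitrary choices, not anything from the
discussion above):

```python
WIDTH, HEIGHT = 256, 64  # one column per grey level, 0..255

def write_grey_wash(path="grey_wash.pgm"):
    """Write a horizontal black-to-white ramp as a plain (P2) PGM file."""
    with open(path, "w") as f:
        f.write(f"P2\n{WIDTH} {HEIGHT}\n255\n")
        row = " ".join(str(x) for x in range(WIDTH))  # 0, 1, ..., 255
        for _ in range(HEIGHT):
            f.write(row + "\n")

write_grey_wash()
```

Most image viewers open PGM directly. Note that Mach bands are a
perceptual effect at luminance steps, so seeing them doesn't by itself
indict the display.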
In addition, LCDs' grey contours tend to be compressed, yielding
sharper, less attractive gradations (which is why operating systems
like OS X have different antialiasing settings depending on whether
you are using an LCD or a CRT).
You'll need that just because of the different gamma curve. (Unless
the antialiasing code inverts the gamma before the blend and reapplies
it afterwards, you'll need different antialiasing for different gamma
curves.)
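The invert-the-gamma point above can be sketched concretely. Blending
antialiased coverage directly in gamma-encoded values gives a visibly
different (darker) result than linearising first, blending, and
re-encoding; the exponent of 2.2 here is an assumed display gamma, not
anything specific to OS X:

```python
GAMMA = 2.2  # assumed display gamma

def blend_naive(fg, bg, coverage):
    """Blend gamma-encoded values directly (what naive AA code does)."""
    return fg * coverage + bg * (1.0 - coverage)

def blend_linear(fg, bg, coverage, gamma=GAMMA):
    """Invert the gamma, blend in linear light, re-apply the gamma."""
    lin = lambda v: v ** gamma          # decode to linear light
    enc = lambda v: v ** (1.0 / gamma)  # re-encode for the display
    return enc(lin(fg) * coverage + lin(bg) * (1.0 - coverage))

# A 50%-covered white pixel over black:
print(blend_naive(1.0, 0.0, 0.5))   # 0.5
print(blend_linear(1.0, 0.0, 0.5))  # ~0.73
```

The two answers differ, and the gamma-aware one differs again for every
distinct gamma curve, which is why antialiasing tuned for one display
type looks wrong on another unless the code does this round trip.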
When I want to determine how an image "actually
looks," I will take
it to my 19" CRT -- and I use Apple LCDs, which are not cheap
displays.
Surely you should be proofing on whatever display the image's target
audience will be using, since that will give you the most accurate
impression of what your end consumers will be seeing?
But LCDs are getting better. I think LED displays will be better yet.
True. CRT displays are a relatively mature technology at the moment;
flat panels are relatively new. (Not that that invalidates current
comparisons; it just means they're likely to change soon.)