Dave Dunfield wrote:
> In a way it makes sense ... The general public understands "thousand"
> better than "k", so giving them the number of bytes in thousands will
> give them a better idea of the actual number than a true "k" value
> would - we computer geeks just think differently than everyone else.
No, we don't. We've just taken the trouble to learn something about the
tools that we use.
At some point it has to be accepted that people need a little education to use
a computer - so I don't see why a rudimentary understanding of how computers
operate shouldn't be part of that.
Heck, it's not even as difficult as that - I expect the average user who just
does a bit of email, surfing and word processing doesn't even need to know the
exact definitions - provided they know the *rough* relationships between the
units, that's enough.
But that's no excuse for the people supplying those computers to go around
trying to redefine fundamental principles of how they work just to make them
somehow more warm and fuzzy.
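For what it's worth, the discrepancy is easy to show with a few lines of
arithmetic. A rough sketch (Python, using a hypothetical "500 GB" drive as
the example figure):

    # Decimal vs. binary units - the "500 GB" figure is hypothetical.
    capacity = 500 * 10**9     # bytes, as the marketing label counts them
    print(capacity / 10**9)    # 500.0   - decimal gigabytes, as advertised
    print(capacity / 2**30)    # ~465.66 - true binary gigabytes (GiB)

That roughly 7% gap is exactly the redefinition being complained about here.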
--
(\__/)
(='.'=) This is Bunny. Copy and paste bunny into your
(")_(") signature to help him gain world domination.