On Thu, 27 Jul 2006, Dave Dunfield wrote:
In a way it makes sense ... The general public understands "thousand" better
than "k", so giving them the number of bytes in thousands will give them a
better idea of the actual number than a true "k" value would - we computer
geeks just think differently than everyone else does.
I find myself counting non-computer-related things in hex all the time
(if you do it out loud, people nearby look at you strangely and move
slightly further away), and people get especially weirded out when you
count on your fingers in binary (up to 1023). Just reaching the number 4
upsets them.
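(For anyone who hasn't tried it, here is a minimal Python sketch of the
finger-counting idea. It assumes the thumb is the least-significant bit and
the finger names are purely for illustration; it just shows why ten fingers
top out at 2**10 - 1 = 1023, and why the number 4, binary 100, raises only
the middle finger.)

    # Binary finger counting: each raised finger is a set bit.
    # Thumb = least-significant bit (an assumption, not a standard).
    FINGERS = ["thumb", "index", "middle", "ring", "pinky",
               "other thumb", "other index", "other middle",
               "other ring", "other pinky"]

    def fingers_for(n):
        """Return which fingers are raised to show n in binary."""
        if not 0 <= n < 2 ** len(FINGERS):
            raise ValueError("ten fingers only cover 0..1023")
        return [name for bit, name in enumerate(FINGERS) if n & (1 << bit)]

    print(2 ** 10 - 1)      # 1023 -- the largest count on ten fingers
    print(fingers_for(4))   # ['middle'] -- why reaching 4 upsets people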
That could take a while... :) Now, do you do it by saying the individual
digits or the decimal representation of the number?