I always hated having to explain to students that
"REAL" numbers in
languages such as BASIC were floating point binary approximations, and
were absolutely NOT "REAL" numbers. Until about 6 years ago, we actually
had a course in "Computer Math", in which we tried to teach students
[among other things] how floating point worked. Of course, we had to
This reminds me of a related issue:
Since almost everybody uses calculators and computers now, it's often
said that learning things like how to do long multiplication and division
is unnecessary for most people (well, unless they want to end up
designing calculators and computers...). But of course such machines are
not mathematically perfect; there are rounding errors and the like [1]. I
feel that if you're not going to teach how to do something by hand because
everybody uses a machine to do it, you should instead teach why the
machine doesn't always get the mathematically correct result, and how to
handle the common pitfalls.
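As a tiny illustration (a Python sketch here, but the same holds in any
language using binary floating point):

```python
# 0.1 and 0.2 have no exact binary floating point representation,
# so quantities that are "obviously equal" can compare unequal.
a = 0.1 + 0.2
print(a)          # 0.30000000000000004
print(a == 0.3)   # False
```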
[1] Certain calculators do what I term 'funny rounding' to get what
appears to be the right answer in most cases. I can't stand such
machines; I want to know exactly what my calculator is going to do.
I am convinced that the majority of spreadsheet users, for example, haven't
a clue as to what is really going on, and that their answers are suspect as
a result.
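The sort of thing that bites spreadsheet users is easy to demonstrate
(sketched in Python; a spreadsheet's internal arithmetic is the same
binary floating point):

```python
# Adding 0.1 a hundred times does not give exactly 10.0,
# because each addition carries a small representation error
# that quietly accumulates.
total = 0.0
for _ in range(100):
    total += 0.1
print(total)           # slightly less than 10.0
print(total == 10.0)   # False
```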
start by undoing a lot of what they had previously
been taught, such as
that PI was EXACTLY 22/7! (Half a century ago, in elementary school, I
got into "big trouble" for telling a teacher that PI was NOT 22/7!)
I have never understood the widespread use of that approximation.
355/113 is more accurate and easier to remember.
'How I need a drink, alcoholic of course, after the heavy lectures involving
quantum mechanics'
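For what it's worth, the difference between the two fractions is easy to
check (a quick Python sketch):

```python
import math

# 22/7 agrees with pi to only about 3 decimal places;
# 355/113 agrees to about 6 decimal places.
for num, den in ((22, 7), (355, 113)):
    err = abs(num / den - math.pi)
    print(f"{num}/{den} = {num / den:.10f}, error = {err:.2e}")
```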
-tony