> On 19 Mar 2010 at 18:48, Tony Duell wrote:
>> I am talking about microcontrollers, not desktop computer CPUs...
> Have you looked at some of the high-end microcontrollers lately?
Err, no. At one time we had these wonderful things called 'data books'.
I'd flip through them to see what chips were available, their features,
etc. I bought a lot of said books.
Now I am expected to get data from the manufacturer's web site. I can't
'flip through' a web site like I could the data book. Oh, the web sites
are great if I want data on a particular device, but not as good if I
just want to see what's available. And no, the 'product selector' pages
never seem to tell me what I need to know...
However, I stand by what I said. For the sort of applications I have in
mind for microcontrollers, knowing exactly how long they will take to
execute each instruction is vital. I guess there are still some for
which that is the case...
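To make that concrete, here is a minimal sketch of the sort of thing
meant, assuming a classic AVR (say, an ATmega328 at 16 MHz) built with
avr-gcc; the choice of PB0 as the output pin is mine, purely for
illustration. The only timebase is the datasheet's fixed cycle count
per instruction, which is exactly what a cached, pipelined CPU cannot
promise:

/* Bit-bang one byte out of PB0 as a 9600-baud serial stream.
   One bit time = 16e6 / 9600 ~= 1667 CPU cycles, counted by hand. */
#include <avr/io.h>
#include <stdint.h>

#define BIT_CYCLES 1667u            /* 16 MHz / 9600 baud */

static void bit_delay(void)
{
    uint16_t n = BIT_CYCLES / 4;    /* each loop pass is 4 cycles */
    __asm__ volatile (
        "1: sbiw %0, 1 \n\t"        /* 2 cycles */
        "   brne 1b    \n\t"        /* 2 cycles while taken */
        : "+w" (n));
}

static void tx_byte(uint8_t b)
{
    PORTB &= ~_BV(PB0);             /* start bit: line low */
    bit_delay();
    for (uint8_t i = 0; i < 8; i++) {   /* 8 data bits, LSB first */
        if (b & 1) PORTB |= _BV(PB0);
        else       PORTB &= ~_BV(PB0);
        b >>= 1;
        bit_delay();
    }
    PORTB |= _BV(PB0);              /* stop bit: line high */
    bit_delay();
}

int main(void)
{
    DDRB  |= _BV(PB0);              /* TX pin as output, idle high */
    PORTB |= _BV(PB0);
    for (;;)
        tx_byte('U');               /* 0x55: alternating bits, easy
                                       to check against a 'scope */
}

The few cycles spent on the port writes themselves are ignored here; a
tidier version would subtract them from each delay, which is only
possible at all because every instruction's cost is known in advance.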
>> You say that as though it's a Good Thing. Personally, I can't think
>> of any change in the last 20 years that's actually made life better
>> for me.
> My favorite tool is an oscillograph, but I can't get the galvanometer

I once used an instrument that claimed to be a 'cathode ray
oscillograph'...

> to move at a couple hundred MHz. Those funny "cathode ray" variations
> are just too leading-edge for me.
Err, considering the CRT is about 106 years old now, I have little
problem with it.
> Time marches on and I choose to march with it, but I acknowledge that
> everyone is different.
I see no reason to change if what I am using is doing the job I require
it to do. And I certainly see no reason to change if the modern
replacement is inferior (to me) to what it would replace.
So I stick to my statement. I can't think of anything new in the last
20 years that's improved my life.
-tony