On 8/3/21 9:46 AM, ben via cctalk wrote:
> Hardware makes software interesting, or is it the
> other way around?
> With C being developed on a PDP-11, you had no decimal operations,
> but IBM had PL/I that did. Everything was binary floating point
> from then on, until the latest floating-point standard for
> hardware and software came out. Decimal is BACK. Now things are more
> confusing than ever, with operating systems changing CPUs with the
> latest marketing gimmick.
You don't need decimal hardware to do decimal arithmetic. CDC 6000 COBOL
killed IBM S/360 COBOL, even though the latter had hardware decimal
features and the former did not--the big CDC iron was never really sold
as a COBOL cruncher, even though it did quite well at it.
Using numbers in their 6-bit display-code representation (33 to 44 octal),
it's a simple matter to perform 10-digit decimal addition and
subtraction in just a few instructions. I'll leave it as an exercise
for those who are curious (I'll give a hint that octal 25 25 25 25...
plays a part).
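Chuck leaves the method as an exercise, so the following is only an illustrative sketch of one way packed display-code addition can fall out of plain binary adds -- not necessarily the 25 25 25... route he hints at:

```python
# Sketch (Python, for illustration only) of word-at-a-time decimal addition
# on ten 6-bit display-code digits packed into a 60-bit word. This is one
# possible approach, not necessarily Chuck's hinted one. Carry out of the top
# digit is simply ignored here.

FIELDS = 10                       # ten 6-bit digits per 60-bit word
ONES   = sum(0o01 << (6 * i) for i in range(FIELDS))  # 1 in every field
THREES = 0o33 * ONES              # display-code '0' in every field
BIT5   = 0o40 * ONES              # bit 5 of every field

def pack(s):
    """Pack a decimal string into display code ('0' = 0o33 ... '9' = 0o44)."""
    word = 0
    for ch in s.zfill(FIELDS):
        word = (word << 6) | (0o33 + int(ch))
    return word

def unpack(word):
    """Unpack a display-code word back into a 10-character decimal string."""
    return ''.join(str(((word >> (6 * i)) & 0o77) - 0o33)
                   for i in reversed(range(FIELDS)))

def decimal_add(a, b):
    """Add two 10-digit display-code words using ordinary binary arithmetic."""
    t = a + b                     # each field now holds digit sum + 0o66 (54);
                                  # a field overflows at 64, i.e. exactly when
                                  # the digit sum reaches 10, so decimal carries
                                  # ride the binary carry chain for free
    no_carry = (t & BIT5) >> 5    # 1 in each field that did NOT carry out
    t -= 0o66 * no_carry          # strip the doubled 0o33 bias from those
    return t + THREES             # re-bias every field back into display code
```

The double excess-33 bias is what makes this work: a 6-bit field overflows precisely when the digit sum reaches ten, so no per-digit carry logic is needed at all.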
Also note that display "0" = 33 octal and display "9" = 44 octal, so
the nines' complement of a display number is the same as its ones'
complement, and subtraction follows quite naturally.
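The complement identity is easy to check: 33 octal plus 44 octal is 77 octal (all six bits set), so flipping every bit of a digit's field maps the code for d to the code for 9 - d. A small sketch:

```python
# Ones' complement of a display-code word IS its nines' complement:
# '0' = 0o33 and '9' = 0o44 sum to 0o77 (all six bits set), so XOR with 0o77
# turns the code for digit d into the code for 9 - d, field by field.

def nines_complement(word, fields=10):
    """Flip all bits of a packed display-code word: each digit d becomes 9 - d."""
    return word ^ ((1 << (6 * fields)) - 1)
```

On the ones'-complement CDC hardware, this is just the machine's native complement, which is why display-code subtraction needs no decimal adjustment step.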
The CDC 6000 has only one addressing granularity--the 60-bit word. There's
no CPU hardware for handling bytes (6- or 8-bit). Yet character
manipulation isn't difficult at all.
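As a hypothetical sketch of what that looks like in practice: picking a character out of a word on such a machine is just a shift and a mask, a couple of fast register operations with no byte addressing involved. The helper names below are mine, not CDC's:

```python
# Character handling on a word-addressed machine, sketched in Python:
# a 60-bit word holds ten 6-bit characters, and extracting or storing one
# is nothing more than a shift and a mask.

def get_char(word, i):
    """Return character i (0 = leftmost) from a word of ten 6-bit codes."""
    return (word >> (6 * (9 - i))) & 0o77

def put_char(word, i, code):
    """Return the word with a 6-bit code stored at position i."""
    shift = 6 * (9 - i)
    return (word & ~(0o77 << shift)) | ((code & 0o77) << shift)
```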
The wonders of RISC. Do a few things, but do them quickly.
--Chuck