On Jul 29, 2014, at 4:15 PM, Chuck Guzis <cclist at sydex.com> wrote:
> STAR had 3-address BCD instructions with a 16-bit field specifying the
> operand length in bytes.  At 2 digits per byte, that came to 128K
> decimal places.  I'd meant to try dividing a 128K-digit decimal number
> by a 64K-digit one just to see how long it would take.  Never did,
> though.  On the 1B, it might well have amounted to starting the
> instruction before going home for the day and checking back in the
> morning.  If only the 1B could have actually stayed up that long!
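A quick back-of-envelope suggests the overnight guess is about right.
The sketch below assumes schoolbook long division (each quotient digit
costing a few full-length trial subtractions) and a made-up 1
microsecond per digit operation; neither figure is a measured STAR
number, they're just placeholders to get the order of magnitude:

    # Rough cost of schoolbook division: 128K-digit dividend,
    # 64K-digit divisor, so ~64K quotient digits, each costing a
    # few 64K-digit trial subtractions.  Timing is an assumption.
    n = 64 * 1024           # divisor length in digits
    trials = 5              # assumed avg trial subtractions per quotient digit
    digit_ops = n * trials * n
    us_per_digit_op = 1.0   # assumed; not a measured STAR figure
    seconds = digit_ops * us_per_digit_op / 1e6
    print(f"{digit_ops:.2e} digit ops, roughly {seconds / 3600:.1f} hours")

That lands near 2e10 digit operations, about six hours at the assumed
rate, so "start it before going home" sounds plausible.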
I wonder how the equivalent test on an IBM 1620 would behave. There's no
length field there, just an end-of-operand marker (a flag bit), so you
could make numbers as big as will fit in memory (60K digits in a maximum
configuration).
Even on SIMH that might take a while.
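Same arithmetic for the 1620, with the caveat that the 60K-digit store
also has to hold the divisor and quotient, so the operands can't all be
maximum size; call it ~20K digits each. The 20 microsecond figure is
the Model I memory cycle as I remember it, so treat both numbers as
assumptions:

    # Same back-of-envelope for a 1620 Model I.
    n = 20_000          # assumed operand size so everything fits in 60K digits
    trials = 5          # assumed avg trial subtractions per quotient digit
    us_per_digit = 20.0 # Model I memory cycle, from memory; an assumption
    seconds = n * trials * n * us_per_digit / 1e6
    print(f"roughly {seconds / 3600:.1f} hours")

Call it ten-plus hours on real hardware. SIMH would grind through the
same digit loop orders of magnitude faster, but it still wouldn't be
instant.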
paul