I seem to remember a VERY short instruction sequence which
converted a binary value between 0 and 99 decimal (0 and 143
octal) into two decimal ASCII characters, each between "0" and
"9". However, I can't remember the actual instructions.
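For reference, the conversion being asked about is easy to state in C,
though this plain divide/modulo version is certainly not the short
PDP-11 sequence I am trying to recall (the function name here is just
my own invention):

```c
#include <assert.h>

/* Split a binary value 0..99 into two ASCII decimal digits.
   The obvious div/mod method -- not the remembered trick. */
void bin_to_two_ascii(unsigned v, char out[2])
{
    out[0] = '0' + v / 10;   /* tens digit  */
    out[1] = '0' + v % 10;   /* units digit */
}
```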
Maybe I am confusing that memory with the following three-instruction
sequence from Billy Y... and if so, please ignore my request.
	sub	#'9+1,r0	; convert ASCII byte
	add	#9.+1,r0	; to an integer
	bcc	20$		; not a number
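As I read that sequence (with one ASCII byte in r0): SUB #'9+1 borrows,
setting C, exactly when the byte is '9' or below; ADD #9.+1 then carries
out, leaving C set, exactly when the byte was in '0'..'9', and leaves the
digit value 0..9 in r0; BCC skips non-digits. A C sketch of that carry
test, using 16-bit arithmetic to mimic the PDP-11 (function name is mine):

```c
#include <stdint.h>

/* Returns 1 and stores the digit value 0..9 if c is an ASCII digit,
   else returns 0 -- mirroring the SUB/ADD/BCC carry trick. */
int ascii_digit(uint16_t c, uint16_t *value)
{
    uint16_t r0 = c;
    r0 -= '9' + 1;                     /* sub #'9+1,r0 */
    uint32_t sum = (uint32_t)r0 + 10;  /* add #9.+1,r0 */
    r0 = (uint16_t)sum;
    if (!(sum >> 16))                  /* bcc: carry clear => not 0-9 */
        return 0;
    *value = r0;                       /* digit value 0..9 */
    return 1;
}
```

For '0' (octal 60) the subtract leaves -10, and adding 10 wraps through
zero with a carry out; for anything above '9' the subtract leaves a small
positive value and the add cannot carry.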
Jerome Fine