On 9/28/2021 2:15 PM, ben via cctalk wrote:
On 2021-09-28 11:43 a.m., Vincent Long standing via cctalk wrote:
The C standards are more liberal, and continue to require char types to be 8 or more bits.
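(For what it's worth, that guarantee shows up as CHAR_BIT in <limits.h>; a quick standard-C check, nothing implementation-specific assumed:)

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* The C standard requires CHAR_BIT to be at least 8. */
        printf("char is %d bits here\n", CHAR_BIT);
        return 0;
    }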
Was PL/I the only language that would let you select the data size for variables? Of course, the fine print would not let you have more than 16 decimal digits or 32-bit binary. You would think that by now a language could handle data of any length.
Hardly.
FORTRAN: INTEGER*4 and INTEGER*8 (and sometimes INTEGER*2, e.g. Sun FORTRAN-77) were common, though never adopted as a standard. Also REAL vs. DOUBLE PRECISION.
COBOL: COMP-1 and COMP-2 for single- and double-precision floating point.
Pascal, Ada: you specify a range of values; how that actually got implemented was implementation-dependent.
And more, I'm sure.
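To put a couple of those side by side, here is roughly what the same width selection looks like in modern C (C99 <stdint.h>/<inttypes.h>); purely an illustrative sketch, not any of the dialects above:

    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        int16_t i2 = 32767;              /* roughly INTEGER*2        */
        int32_t i4 = 2147483647;         /* roughly INTEGER*4        */
        int64_t i8 = INT64_MAX;          /* roughly INTEGER*8        */
        float   r4 = 3.14159f;           /* REAL, or COBOL COMP-1    */
        double  r8 = 3.14159265358979;   /* DOUBLE, or COBOL COMP-2  */

        printf("%" PRId16 " %" PRId32 " %" PRId64 "\n", i2, i4, i8);
        printf("%.5f %.14f\n", r4, r8);
        return 0;
    }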
As for any length of data, that becomes a cost/return question. Adding that capability would create no end of headaches for language implementers, so it isn't done. Instead, if one actually needed it, one would define a type (with methods) or a set of functions to accomplish it; no doubt many such libraries exist out on the 'net.
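A bare-bones version of that hand-rolled approach might look like the sketch below: decimal digits kept in strings, schoolbook addition. (Names like big_add are made up for illustration; real packages such as GMP work in binary limbs and are far more capable.)

    #include <stdio.h>
    #include <string.h>

    /* Add two non-negative decimal numbers of arbitrary length,
     * held as strings of digits.  Illustrative sketch only. */
    static void big_add(const char *a, const char *b, char *out, size_t outsize)
    {
        size_t la = strlen(a), lb = strlen(b);
        size_t n = (la > lb ? la : lb) + 1;     /* room for a final carry */
        if (n + 1 > outsize) { out[0] = '\0'; return; }

        int carry = 0;
        for (size_t i = 0; i < n; i++) {
            int da = (i < la) ? a[la - 1 - i] - '0' : 0;
            int db = (i < lb) ? b[lb - 1 - i] - '0' : 0;
            int s  = da + db + carry;
            out[n - 1 - i] = (char)('0' + s % 10);
            carry = s / 10;
        }
        out[n] = '\0';

        /* strip the leading zero if there was no final carry */
        if (out[0] == '0' && n > 1)
            memmove(out, out + 1, n);           /* includes the '\0' */
    }

    int main(void)
    {
        char sum[64];
        big_add("99999999999999999999", "1", sum, sizeof sum);
        printf("%s\n", sum);    /* prints 100000000000000000000 */
        return 0;
    }

The main() just adds 1 to a 20-digit number to show the carry rippling all the way through.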
JRJ