-----Original Message-----
From: cctalk <cctalk-bounces at classiccmp.org> On Behalf Of Van Snyder via cctalk
Sent: 28 September 2021 23:34
To: cctalk at classiccmp.org
Subject: Re: Linux and the 'classic' computing world
On Tue, 2021-09-28 at 17:03 -0500, Jay Jaeger via cctalk wrote:
On 2021-09-28 11:43 a.m., Vincent Long standing via cctalk wrote:
The C standards are more liberal, and continue to require char types to be 8 or more bits.
Was PL/I the only language that would let you select the data size for variables? Of course, the fine print would not let you have more than 16 decimal digits, or 32-bit binary. You would think that by now a language could handle data of any length.
Hardly.
FORTRAN: INTEGER*4 and INTEGER*8 (and sometimes INTEGER*2, e.g. Sun FORTRAN-77) were common, though never adopted as a standard. Also REAL vs. DOUBLE PRECISION.
Fortran 90 introduced "kind type parameters" for all types. For REAL, one can use SELECTED_REAL_KIND to ask for a kind with at least a specified number of decimal digits. The standard does not require that any particular precision be supported.
Both "default real" and double precision are required. Many processors provide quad precision. For INTEGER, one can use SELECTED_INT_KIND. Processors are required to provide at least one integer kind with at least 18 decimal digits. The standard does not specify which other sizes must be provided.
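A minimal sketch of how those inquiry functions are used (standard Fortran; the kind values returned, and whether a matching real kind exists at all, vary by compiler):

    program kinds
      implicit none
      ! Ask for a real kind with at least 15 decimal digits of precision;
      ! SELECTED_REAL_KIND returns a negative value if no such kind exists.
      integer, parameter :: dp = selected_real_kind(p=15)
      ! Ask for an integer kind holding at least 18 decimal digits;
      ! the standard guarantees at least one such kind exists.
      integer, parameter :: i8 = selected_int_kind(18)
      real(dp)    :: x
      integer(i8) :: n
      x = 1.0_dp / 3.0_dp
      n = 123456789012345678_i8
      print *, 'real kind', dp, 'gives', precision(x), 'digits'
      print *, 'integer kind', i8, 'gives a decimal range of', range(n)
    end program kinds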
REXX has had this ability from the start. It only does decimal arithmetic, but you can set the number of numeric digits used to whatever you want.
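For example, in REXX itself (NUMERIC DIGITS applies to all subsequent arithmetic; the default is 9 significant digits):

    /* Raise the working precision from the default 9 digits to 50 */
    numeric digits 50
    say 2 / 3        /* prints 0.6666... to 50 significant digits */
    say digits()     /* prints 50, the current setting */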
Dave