Computing from 1976

dwight dkelvey at hotmail.com
Mon Jan 1 11:26:10 CST 2018


One other way that larger/faster becomes a problem: probability!

We think of computers as always making discrete steps. This is not always true. The processors run so fast that different areas, even using the same clock, have enough skew that the data has to be treated as asynchronous. Transferring asynchronous information is always a probability issue. It used to be that 1 part in 2^30 was such a large number it could be ignored. At gigahertz clock rates, though, 2^30 events go by in roughly a second.
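
To put a rough number on it: the standard synchronizer failure model is MTBF = exp(t_r/tau) / (T_w * f_clk * f_data). Here is a small Python sketch of it; every parameter value is an assumption I picked for illustration, not a measurement of any real part:

    import math

    # Standard metastability MTBF model for a synchronizer flop:
    #   MTBF = exp(t_r / tau) / (T_w * f_clk * f_data)
    # All numbers below are illustrative assumptions, not real device data.
    tau    = 20e-12    # flop resolution time constant, seconds (assumed)
    T_w    = 50e-12    # metastability capture window, seconds (assumed)
    f_clk  = 2e9       # receiving clock, 2 GHz (assumed)
    f_data = 200e6     # asynchronous data rate, 200 MHz (assumed)

    def mtbf(t_r):
        """Mean time between synchronization failures for settling time t_r."""
        return math.exp(t_r / tau) / (T_w * f_clk * f_data)

    print(mtbf(0.5e-9))   # ~3600 s: a failure about every hour
    print(mtbf(1.0e-9))   # ~2.6e14 s: millions of years

Each extra half-nanosecond of settling time buys an exponential margin, which is why synchronizers are built from two or more flops in a row.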

Parts often use ECC to account for this, but that only works if the loss is recoverable (not always so).
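
As a toy illustration of that caveat, here is a minimal Hamming(7,4) sketch in Python. It corrects any single flipped bit, but a double flip gets silently "corrected" into wrong data; real SECDED memory adds an overall parity bit precisely to detect that case. This is my own illustrative code, not anyone's production ECC:

    # Toy Hamming(7,4): parity bits at positions 1, 2, 4 of the codeword.
    def encode(d):
        d1, d2, d3, d4 = d
        p1 = d1 ^ d2 ^ d4
        p2 = d1 ^ d3 ^ d4
        p3 = d2 ^ d3 ^ d4
        return [p1, p2, d1, p3, d2, d3, d4]

    def decode(c):
        # The syndrome (XOR of the positions of all 1 bits) names the
        # flipped position, or 0 if the codeword looks clean.
        s = 0
        for i, bit in enumerate(c, start=1):
            if bit:
                s ^= i
        if s:
            c[s - 1] ^= 1           # "repair" whatever position s names
        return [c[2], c[4], c[5], c[6]]

    word = encode([1, 0, 1, 1])
    word[3] ^= 1                    # one flip: recoverable
    print(decode(word[:]))          # [1, 0, 1, 1] -- corrected
    word[5] ^= 1                    # two flips: not recoverable
    print(decode(word[:]))          # [1, 1, 0, 1] -- wrong, and no warning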

Even the flipflops have a probability of losing their value. Flops that are required to hold state for a long time are designed differently than flops that are only used for transient data.

All these probabilities add up. We expect, more and more, that no mistake will happen, even as we add more and more data and code to deal with it. What fail-safes do we have when things are so complex that multiple failures are expected to happen?
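
The arithmetic behind "add up" is simple but unforgiving: with n independent parts each failing with probability p, the chance that at least one fails is 1 - (1 - p)^n. A quick sketch with illustrative numbers:

    p = 2**-30                     # "ignorably" rare for any one part
    for n in (10**3, 10**6, 10**9):
        print(n, 1 - (1 - p)**n)
    # ~1e-6, ~1e-3, ~0.6: with a billion parts, a failure somewhere
    # is more likely than not.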

Dwight

________________________________
From: cctalk <cctalk-bounces at classiccmp.org> on behalf of Noel Chiappa via cctalk <cctalk at classiccmp.org>
Sent: Monday, January 1, 2018 5:15:15 AM
To: cctalk at classiccmp.org
Cc: jnc at mercury.lcs.mit.edu
Subject: Re: Computing from 1976

    > From: Peter Corlett

    > since we have computers with multiple gigabytes of RAM, it makes little
    > sense to restrain one's use of them to a fraction of the capabilities,
    > except as an intellectual exercise.

For data, sure. (It's amazing how big even images can become, as the
resolution is increased. And that's not even video!)
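
A quick back-of-envelope, uncompressed at 24 bits per pixel (my own illustrative figures):

    # Uncompressed frame size at 3 bytes per pixel.
    for w, h in ((640, 480), (1920, 1080), (7680, 4320)):
        print(f"{w}x{h}: {w * h * 3 / 2**20:.1f} MiB")
    # 0.9 MiB at VGA, 5.9 MiB at 1080p, 94.9 MiB at 8K --
    # and video multiplies that by the frame rate.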

For code, however, there are very good reasons to think that 'more is _NOT_
better'. More code means more complexity, and that has a host of Really Bad
consequences: harder to understand, more likely to have bugs, etc, etc.

It also often means that unless you have the Latest and Greatest hardware,
the machine is simply too slow to run the latest bloatware. The machine I'm
typing this on has a 1.4GHz single-core CPU, and _most_ things run on it just
fine - but going to many Web sites is now painful, since the 'obligatory'
HTTPS (another hot button, one I'll refrain from hitting right now, to keep
the length of this down) makes even simple Web operations slow.

        Noel

