From: Swift Griggs
even though there is *more* overall documentation on the Internet, the
docs you get with hardware and tools are nowhere near as good as they
were in the 80s AFAIK.
I think that's partly because product cycles have sped up; there just
isn't time to get good docs done and out. Also, competition drives
'efficiency', and good documentation costs money/overhead.
Wonderful as e.g. DEC Technical Manuals were, I suspect producing their ilk
nowadays is simply beyond the industry's capabilities (in the organizational
sense, not the technical skill sense).
From: Liam Proven
C is popular because C is popular.
Yes, but that had to start somewhere.
I think it _became_ popular for two reasons: i) it was 'the' language of
Unix, and Unix was so much better than 99% of the alternatives _at the time_
that it grew like crazy, and ii) C was a lot better than many of the
alternatives _at the time it first appeared_ (for a number of reasons, which
I won't expand on unless there is interest).
direct memory allocation, pointer manipulation and so on -- are
widespread /because/ of the C family influence. And I have a deep
suspicion that these are harmful things.
Pointers, yes. Allocation, not really - a lot of languages have heaps.
Did you mean manual freeing when you said 'memory allocation'?
Technically, even something like CONS allocates memory. And one could
consider 'auto' variables as 'allocated' - but the 'freeing' is
automatic, when the routine returns.
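(To make the distinction concrete, here is a minimal sketch in modern C
- the names are mine, purely illustrative:)

    #include <stdlib.h>

    void example(void)
    {
        int auto_var;               /* 'allocated' on the stack; 'freed'
                                       automatically when we return */
        int *heap_var;

        heap_var = malloc(sizeof(*heap_var));  /* explicit heap allocation */
        if (heap_var == NULL)
            return;                 /* allocation can fail */
        *heap_var = 42;
        auto_var = *heap_var;
        free(heap_var);             /* the manual freeing in question;
                                       auto_var needs no such call */
    }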
As to whether those two are harmful - they can be. You have to have a 'clean'
programming style to use them extensively without running into problems. (I
personally have only rarely run into problems with them - and then mostly in
very early C, before casts existed, because of C's weird pointer math rules.)
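(For anyone who hasn't hit this: C pointer arithmetic is scaled by the
size of the pointed-to type, which is where most of the surprises come
from - a small illustration in modern C, not the pre-cast dialect:)

    void scaling_demo(void)
    {
        int a[4] = { 10, 20, 30, 40 };
        int *p = a;

        p = p + 1;      /* advances by sizeof(int) bytes, not by 1 byte */
                        /* so *p is now 20: the '+ 1' was scaled */
    }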
I would need to think about this for a while to really come up with a good
position, but my initial sense is that these two are perhaps things that
should be hidden in the depths of large systems, for use by very good
programmers, and that 'average' programmers should use only higher-level
constructs.
(E.g. build a 'system' of routines to manage a certain kind of object -
like OO languages enforce in the language itself - and the average user
only calls foo_allocate(), etc.)
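(A minimal sketch of that shape, in C - the 'foo' names are just the
placeholders from above, not a real interface:)

    /* foo.h - all that the 'average' programmer ever sees */
    typedef struct foo foo;          /* opaque: layout hidden */
    foo *foo_allocate(void);
    void foo_free(foo *f);

    /* foo.c - the raw allocation and pointer work is confined here */
    #include <stdlib.h>

    struct foo {
        int refcount;
        /* ... */
    };

    foo *foo_allocate(void)
    {
        foo *f = calloc(1, sizeof(*f));  /* zeroed, so fields start sane */
        if (f != NULL)
            f->refcount = 1;
        return f;
    }

    void foo_free(foo *f)
    {
        free(f);                         /* free(NULL) is a legal no-op */
    }

The point being that the dangerous operations live in exactly one place,
written once by someone careful.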
I actually hugely admire Linux
...
We are continuing to refine and tweak a 1970s-style OS -- a
technologically conservative monolithic Unix. FOSS Unix hasn't even
caught up with 1990s-style Unix design yet, the early microkernel
ones... It's roughly 2 decades behind the times.
I'm a bit puzzled by your first thought, given your follow-on (which I agree
with).
I'd go further in criticizing Linux (and basically all other Unix
descendants), though - your latter comment above is just about the
_implementation_. I have a problem with the _basic semantics_ - i.e. the
computational environment provided to user code. It was fantastic on a PDP-11
- near-'mainframe' levels of capability on a tiny 16-bit machine.
On modern machines... not so much.
Noel