On 8/29/11 8:40 AM, William Donzelli wrote:
>> Also sufficient: source code. Open source has an unconquerable advantage
>> here, over proprietary.
> I do not have complete faith about this in the long term. The open
> source folks tend to be a loose group of undisciplined volunteers,
> generally not long term thinkers, with only a very small segment
> interested in historic preservation.
And they toss legacy hardware support at an alarming rate because it's too
much work. Linux is an extreme example of this; NetBSD is pretty much the
opposite.
They also thrash on the 'fun' parts of systems (GNOME/KDE look and feel)
and don't bother with the dirty, un-fun parts of making systems stable
for the real world.
I don't know how far back they ever managed to go, but Stallman/FSF were
hunting for old GNU distributions because they didn't have a complete
archive of what they had released.
I had looked at the problem of archiving open source projects for CHM and
couldn't get my head around what to consider important. SourceForge was
filled with half-baked, abandoned projects, and now I don't even know where
to start with the half-baked, abandoned web sites trying to be SourceForge.
I also don't agree that modern software systems are in a better state to
be preserved. In the 'old days' you could get a shrink-wrapped package that
was the entire deal, docs, and media. Today, it's a blob that comes off the
web and gets patched very frequently, as well as potentially being tied to
the mother-ship to be functional. There are dozens (hundreds?) of products
now that are devices tethered to now-dead back-end services (the Danger
Sidekick, for example). How do you even begin to preserve those?
The direction the world is heading with computers is making it very, very
difficult to preserve.