On Thu, May 19, 2005 at 10:43:09AM -0700, Vintage Computer Festival wrote:
> See Dwight's last reply re: archivist standards.  Putting stuff in a
> ZIP file is NOT archiving.
For me, archiving has always been about preserving information. It
has had nothing to do with the transport mechanism of that
information. ZIP is a transport mechanism; an encoding of
information, just like an ASCII file is an encoding of information.
If I have an old hardware manual, and I scan it, OCR it
(accurately!) to text, format the text the same way it is typeset
in the book (i.e., preserving tables, equations, etc.) in HTML,
and put it on the web -- did I not just archive the book?  If
all of the information is available, what's the big deal?  What is
lost in the translation?
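To make the point concrete: ZIP is a lossless encoding, so the bytes that go in come back out bit-identical.  A minimal sketch using Python's standard zipfile module (the file name and sample text are just illustrative):

```python
import io
import zipfile

# Pretend this is the OCR'd text of an old hardware manual.
original = b"Some scanned manual text: tables, equations, etc.\n"

# Encode: store the text inside an in-memory ZIP archive.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("manual.txt", original)

# Decode: read it back out of the archive.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    restored = zf.read("manual.txt")

# Nothing was lost in the translation.
assert restored == original
```

The transport mechanism changed (raw bytes vs. deflated bytes in a ZIP container), but the information is unchanged.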
> > I see people on the thread complaining about having to bundle a
> > Windows emulator with each archive.  Excuse me?  Let's look at some
> > popular formats: TAR, ZIP, and RAR all have source-code unarchivers,
> > which means they can run on any machine with a C compiler.  So
> > what's with all the paranoia?  Just use whatever works, as long as
> > more than one major platform can extract it.
> For now.  What about 1 year from now?  5 years?  10 years?  50 years?
> 100 years?  500 years?  Think LONGTERM.
LONGTERM, the information will have been translated to new media
by then.  When you want to read the Constitution of the United
States, do you travel all the way to Washington, D.C., or do you
look it up online?  Is the online version any less properly archived?
If LONGTERM is truly a concern, then why is the information being
stored as ASCII data files at all?  I can think of a lot more
durable information transport mechanisms than hard disks or magnetic
tape...
> > (Not directed at Sellam, just commenting on all thread participants.)

> Ain't no thang ;)
:-)
--
Jim Leonard    http://www.oldskool.org/    Email: trixter at oldskool.org
Like PC games?  Help support the MobyGames database: http://www.mobygames.com/
Or taste a slice of the demoscene at http://www.mindcandydvd.com/