"Cory
Doctorow tells us that '[i]n 2007, John Goerzen scraped every gopher
site he could find (gopher was a menu-driven text-only precursor to the Web;
I got my first online gig programming gopher sites). He saved 780,000
documents, totalling 40GB. Today, most of this is offline, so he's making
That's heroic and all, but how is a 2007 gophergrab(tm) at all
representative? I'm surprised there was anything at all left at that
point (aside from retro-sites.)
Actually, there were still some larger academic sites up then. I
archived a few myself (userserve.ucsd.edu was particularly nostalgic for
me since it was a little SE/30 with a big disk in AP&M, and I managed to
archive it before it was decommissioned -- I used it as an undergrad).
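
For anyone wondering what that sort of archiving actually entails, gopher
is about as simple as protocols get: open a TCP connection to port 70,
send the selector string followed by CRLF, and read until the server
closes the connection (RFC 1436). A minimal sketch in Python -- the
gopher_fetch helper is purely illustrative, not what Goerzen or I used,
and gopher.floodgap.com is Floodgap's public gopher server:

    import socket

    def gopher_fetch(host, selector="", port=70):
        # RFC 1436: send the selector plus CRLF, then read until EOF.
        with socket.create_connection((host, port), timeout=10) as sock:
            sock.sendall(selector.encode("ascii") + b"\r\n")
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks)

    # The empty selector fetches the server's root menu; each menu line
    # is a type character plus display string, selector, host, and port,
    # separated by tabs.
    menu = gopher_fetch("gopher.floodgap.com")
    for line in menu.decode("latin-1").splitlines()[:10]:
        print(line)

A crawler like Goerzen's is essentially this plus a queue of (host,
selector) pairs parsed out of each menu it retrieves.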
--
------------------------------------ personal: http://www.cameronkaiser.com/ --
  Cameron Kaiser * Floodgap Systems * www.floodgap.com * ckaiser at floodgap.com
-- Seen on hand dryer: "Push button for a message from your congressman." -----