Not so much related to classic hardware directly, but instead the
*documentation* for classic hardware *and* software:
OK, let's say I've got a couple thousand sheets of microfiche, 4" x 5".
Each sheet contains ~200 paper pages of text and drawings. I want to digitize
at least part of this, possibly OCR'ing it too.
Each 8.5" x 11" printed page on the microfiche is about 0.15" x 0.2",
so the magnification factor is about 50. That means if I want the equivalent
of a 75 DPI scan of the full-size version, that I need to scan the microfiche
at about 4000 DPI. The el-cheapo (i.e. a couple hundred $) scanners I see
on the shelves here seem to top out at 2400 DPI.
And 4000 DPI is the "minimum acceptable" number in my above calculation. If
I can do 4 times better than that, so much the better. In my experience
most 75 DPI scans of 8.5" x 11" text don't OCR well at all; you need more
resolution.
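
To make the arithmetic explicit, here's the back-of-the-envelope version in
Python, using the approximate dimensions above (it comes out a bit over the
rounded ~50x / ~4000 DPI figures, which is why I call 4000 the minimum):

# Back-of-the-envelope: what scan resolution do I need on the fiche to get
# a given effective DPI on the original page?  Dimensions are the rough
# figures quoted above.
page_w, page_h = 8.5, 11.0      # printed page, inches
frame_w, frame_h = 0.15, 0.20   # same page as imaged on the fiche, inches

# Reduction factor; take the larger of the two ratios to be conservative.
factor = max(page_w / frame_w, page_h / frame_h)
print(f"reduction factor: ~{factor:.0f}x")

# Fiche scan resolution needed for a few effective page resolutions.
for effective_dpi in (75, 150, 300):
    print(f"{effective_dpi:>3} DPI effective -> scan at ~{effective_dpi * factor:.0f} DPI")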
So what are my choices for higher-resolution scanners? My *other* hobby
happens to be large-format photography, so if the resulting scanner is also
good for 4" x 5" negatives and/or transparencies I won't complain :-).
It looks like there are 35mm film scanners with 2700 or 3000 DPI resolutions
available for a few thousand dollars, but I think I need to do better than that.
Of course, I could go into the darkroom and enlarge the microfilm, but doing
that for each of the thousands of sheets is going to be tedious. Yeah, I
know, it's already a tedious job!
Finally, do *any* scanners have documented interfaces? I.e., say I find myself
a nice SCSI-connected, high-speed, high-resolution scanner. Am I going to be
reduced to point-and-drool with Windows 98, or can I actually hook the
scanner up to a real computer? We're talking about many tens or hundreds
of gigabytes of data here, so I'm willing to invest some effort to automate
the acquire/compress/archive process.
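
If the scanner does turn out to be scriptable (some kind of command-line
acquisition tool rather than a GUI), the loop I have in mind is roughly the
sketch below. "fiche-acquire" is just a stand-in name for whatever the real
acquisition command turns out to be, not an actual program:

# Rough sketch of an acquire/compress/archive loop, assuming the scanner
# exposes a command-line acquisition tool.  "fiche-acquire" is a placeholder.
import gzip
import shutil
import subprocess
from pathlib import Path

ARCHIVE = Path("/archive/microfiche")

def scan_sheet(sheet_id: str, dpi: int = 4000) -> None:
    """Acquire one fiche sheet, compress it, and file it in the archive."""
    raw = Path(f"{sheet_id}.tiff")

    # 1. Acquire: invoke the (hypothetical) scanner front end.
    subprocess.run(
        ["fiche-acquire", "--dpi", str(dpi), "--output", str(raw)],
        check=True,
    )

    # 2. Compress: gzip the raw image and drop the uncompressed copy.
    compressed = raw.with_name(raw.name + ".gz")
    with raw.open("rb") as src, gzip.open(compressed, "wb") as dst:
        shutil.copyfileobj(src, dst)
    raw.unlink()

    # 3. Archive: move the compressed file into the archive tree.
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    shutil.move(str(compressed), str(ARCHIVE / compressed.name))

if __name__ == "__main__":
    for n in range(1, 11):          # e.g. the first ten sheets
        scan_sheet(f"sheet{n:04d}")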
Tim.