Chuck Guzis wrote:
It's an interesting question about how much oversampling is necessary.
Most modern 3.5" floppies are rated with an ISV of around +/- 3% and
CSV of about half that. 5.25" legacy floppies, particularly those with
belt drives, go considerably outside of that.
Data separators expect some pretty wide variations; the lowly WD9216
(used on a lot of older FDC boards) claims to be able to adjust to +/-
30% of nominal center (reference) frequency.
So how much oversampling is optimum in light of legacy hardware
performance and how much is gilding the lily?
The data separator has to deal with a lot more than just the variation
in drive speed. That variation has long-term and short-term components,
and the separator also has to deal with the bit shifting that isn't
completely avoided by write precompensation.
For many disks you can read adequately using the very simple software
data separator techniques that have been described previously in this
thread. However, for disks that are marginal in one way or another, the
simple techniques will fail. Radio Shack customers found this out the
hard way with the disk controller in the Model I expansion interface,
and that wasn't even doing MFM.
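To make the simple approach concrete, here is a minimal sketch (not code
from this thread; names and timings are my own assumptions): a
fixed-threshold classifier for MFM flux-transition intervals at 500
kbit/s, where the legal spacings are nominally 2, 3, and 4 microseconds.
It has no feedback for speed variation or peak shift, which is exactly
why this style of separator fails on marginal disks.

```python
# Fixed-threshold MFM interval classifier -- the "simple technique."
# At 500 kbit/s MFM, flux transitions are nominally 2, 3, or 4 us apart.
# Hypothetical sketch; thresholds sit halfway between nominal spacings.

def classify_fixed(deltas_us):
    """Map each flux-transition interval (in us) to its nominal spacing."""
    out = []
    for d in deltas_us:
        if d < 2.5:        # shorter than 2.5 us reads as a 2 us gap
            out.append(2)
        elif d < 3.5:      # between 2.5 and 3.5 us reads as 3 us
            out.append(3)
        else:              # anything longer reads as 4 us
            out.append(4)
    return out
```

With a drive running a few percent fast plus some peak shift, intervals
drift toward the fixed thresholds and start to misclassify; nothing in
the loop adapts.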
One of the advantages of sampling the pulses from the drive and saving
the samples or delta times is that you can use data separation
algorithms that would be impractical in a real floppy disk system. For
instance, you can use non-causal algorithms (adjusting your separator
based on future data in addition to past data), and algorithms that
draw on information derived from other tracks of the same disk. When I
was experimenting
with this some years back, I found both techniques to be useful on
problem disks.
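One way to picture the non-causal idea (a hypothetical sketch, not the
actual algorithm described above): once the intervals are captured,
estimate the local cell time from a centered window that includes
samples *after* the current one as well as before it, then classify
each interval against that local estimate. A real-time separator can
only look backward; captured data lets you look both ways.

```python
# Non-causal interval classifier sketch. Assumes intervals are normalized
# so the nominal half-bit cell is 1.0 and legal spacings are 2, 3, 4 cells.

NOMINAL_CELL = 1.0

def nearest_cells(delta, cell):
    """Return 2, 3, or 4 -- whichever multiple of the cell time is closest."""
    return min((2, 3, 4), key=lambda c: abs(delta - c * cell))

def classify_noncausal(deltas, window=16):
    # First pass: rough per-interval cell estimate against the nominal rate.
    rough = [d / nearest_cells(d, NOMINAL_CELL) for d in deltas]
    out = []
    n = len(deltas)
    for i, d in enumerate(deltas):
        lo = max(0, i - window)
        hi = min(n, i + window + 1)
        # Centered average: uses future intervals as well as past ones.
        local_cell = sum(rough[lo:hi]) / (hi - lo)
        out.append(nearest_cells(d, local_cell))
    return out
```

Even when the first pass misjudges an individual interval (say, a disk
written 20% fast, where a 3-cell gap lands nearer the 4-cell point),
the windowed estimate pulls the local cell time toward the true rate
and the second pass recovers the correct classification.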
Eric