> Usually you know the encoding scheme, and I don't see anything wrong

Do you? I don't think I know the exact details of the encoding scheme for
half of my computers, and I have schematics for the lot (and service
manuals for quite a lot of them).

> with using whatever knowledge you have of the interface to optimize it.
> If a different encoding scheme is used, an FPGA-based design could easily
> adapt.

Only if you really understand the encoding scheme. And do you really want
to have to come up with a new FPGA configuration file for every computer,
including ones you've never seen?

> But you're right, it's probably possible to undo write-precomp by
> simulating the drive's tendency to push close transitions apart. I do think
> some adjustments are needed per drive type... The write precomp value is
> pretty small - on the order of 5-10% of the data period.

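To put a very rough number on that - my figures, not ones from the message
above, and assuming an ST506-class data rate of 5 Mbit/s (a 200ns data
period):

data_rate = 5e6                    # assumed ST506-class rate, bits/sec
bit_cell_ns = 1e9 / data_rate      # 200ns data period
for pct in (5, 10):
    print(pct, '% ->', bit_cell_ns * pct / 100, 'ns')   # about 10-20ns
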
On the older controllers, precompensation was typically done using a
delay line (probably 10ns per tap) and a 74153 4-input mux (of which only 3
inputs were used). The 2 control inputs to the mux were 'early' and 'late'
(both deasserted left the bit at the nominal timing position, asserting
one of the signals moved it appropriately).
The early/late signals generally came from the controller IC if there was
one. Or maybe from a little state machine based on the data bitstream.
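
In (entirely hypothetical) Python rather than TTL, that arrangement amounts
to something like this - the 10ns-per-tap figure is the guess above, and the
tap numbering and names are mine:

TAP_NS = 10        # assumed delay per tap of the delay line
NOMINAL_TAP = 1    # middle tap: neither 'early' nor 'late' asserted

def precompensate(times_ns, early_late):
    """Model of the delay line + 74153 mux.

    times_ns:   nominal time of each write transition, in ns
    early_late: -1 = 'early' asserted, 0 = neither, +1 = 'late' asserted
    """
    out = []
    for t, sel in zip(times_ns, early_late):
        tap = NOMINAL_TAP + sel          # which mux input gets selected
        out.append(t + tap * TAP_NS)     # every bit picks up some fixed delay
    return out

The early_late list stands in for whatever the controller IC or state
machine decides from the data pattern.
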
The PERQ 2T2 DIB (Disk Interface Board) has a PAL for this, but alas all
the DIBs I have are for drives that do internal precompensation, and the
PAL never asserts either signal. So I don't have the details of the state
machine...
Of course the delay line and mux could be replicated in the emulator to
move the bits back again (all bits are delayed by the same total time,
which doesn't really matter; if the controller makes a bit 'late' then we
pick it up off an earlier tap of the delay line in the emulator, and vice
versa).
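
As a sketch (same assumed 10ns taps as above, and assuming the emulator can
reproduce the controller's early/late decisions from the data pattern - the
bit I don't have for the PERQ PAL), the undo is just the mux selection
swapped:

TAP_NS = 10        # same assumed figures as the sketch above
NOMINAL_TAP = 1

def undo_precomp(times_ns, early_late):
    """Pick bits the controller made 'late' off an earlier tap, and vice versa."""
    out = []
    for t, sel in zip(times_ns, early_late):
        tap = NOMINAL_TAP - sel          # early and late swapped
        out.append(t + tap * TAP_NS)
    return out

Run back-to-back with the precompensate() sketch above, every transition
ends up with the same constant two-tap delay, which is exactly the fixed
offset that doesn't matter.
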
-tony