formatting and checking floppy disks for bad sectors

Fred Cisin cisin at xenosoft.com
Sat Jul 30 21:56:18 CDT 2016


On Sun, 31 Jul 2016, Maciej W. Rozycki wrote:
> From my experience with floppy formats and reasonably fast computers
> (i.e. where CPU processing latency doesn't really matter) the best results
> are obtained with no interleaving, no sector staggering on head switching,
> and single-sector staggering on cylinder incrementing.
> . . . 
> A seek command requesting the next cylinder however requires the step
> motor to physically move the head assembly and it takes enough time to
> complete for the medium to have already rotated past the next physical sector.
> So if that happened to be the logical sector requested with the next read
> (or write) command as well, then almost a full rotation would be required
> to even start executing the data transfer.  So to avoid that missed sector
> you need to stagger sectors back by one on cylinder increments, so that
> the physical sector lost to the head movement is the one you will need
> last, and the subsequent one is the logically next one.
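
To make that concrete, here is a rough Python sketch (my own
illustration; the function and numbers are mine, not from any real
formatter) that lays out the logical sector IDs on each track for a
given interleave and per-cylinder stagger:

def track_layout(sectors_per_track, interleave, cylinder, skew):
    """Logical sector IDs (1-based), listed in physical slot order."""
    layout = [0] * sectors_per_track
    # The starting slot rotates back by `skew` slots per cylinder.
    pos = (cylinder * skew) % sectors_per_track
    for logical in range(1, sectors_per_track + 1):
        while layout[pos] != 0:               # slot taken: slide forward
            pos = (pos + 1) % sectors_per_track
        layout[pos] = logical
        pos = (pos + interleave) % sectors_per_track
    return layout

# 9 sectors/track (360K), 1:1 interleave, one-sector stagger/cylinder:
for cyl in range(3):
    print(cyl, track_layout(9, 1, cyl, 1))
# 0 [1, 2, 3, 4, 5, 6, 7, 8, 9]
# 1 [9, 1, 2, 3, 4, 5, 6, 7, 8]
# 2 [8, 9, 1, 2, 3, 4, 5, 6, 7]

Read sector 9 at the end of cylinder 0, step, and sector 1 of cylinder
1 arrives one slot later -- exactly the "stagger back by one" above.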

There is also the issue of the drive itself.
The Qumetrak 142 (360K), which IBM used briefly, stepped so slowly that 
IBM lengthened the default track-to-track step time in PC-DOS 2.10 
(relative to V 2.00).  With that drive, you would want TWO sectors of 
stagger for optimum results.  The same goes for an SA400 with 256- (or 
128-) byte sectors.
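
Roughly, the stagger you want is just the step-plus-settle time divided
by the time one sector takes to pass under the head.  A quick Python
sketch (the drive timings below are illustrative guesses of mine, not
datasheet numbers):

import math

def stagger_needed(step_ms, settle_ms, rpm, sectors_per_track):
    ms_per_sector = (60_000 / rpm) / sectors_per_track
    return math.ceil((step_ms + settle_ms) / ms_per_sector)

# 300 RPM, 9 sectors/track -> about 22 ms per sector
print(stagger_needed(6, 15, 300, 9))     # quick stepper: 1 sector
print(stagger_needed(20, 15, 300, 9))    # slow stepper:  2 sectors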

BUT, there are also issues of application software.  Consider the case of 
a machine, or group of machines, that are used consistently with one 
particular program.  If that program reads a record, processes it, and 
then reads the next one, the time spent on that processing could make a 
different interleave more effective, FOR THAT SPECIFIC application.
Surely, you have seen specific machines and their floppies 
that are always used for one particular program, particularly a 
visi-clone, a word processor, a database program, or some "vertical" 
program for that business.  Not all programs read the entire file into 
RAM.   With 16K-64K of RAM, handling non-trivial file sizes required 
reading part of the file at a time.
Remember when the maximum datafile size changed, for the most part, from 
being limited by disk size to being limited by RAM size?
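
The same arithmetic applies to the application: pick the interleave so 
that the next logical sector arrives just after the program finishes 
chewing on the previous one.  Another sketch of mine (the processing 
times are made up for illustration):

import math

def best_interleave(process_ms, rpm, sectors_per_track):
    ms_per_sector = (60_000 / rpm) / sectors_per_track
    # Interleave n leaves (n - 1) sector times between logically
    # consecutive sectors; that gap must cover the processing time.
    return 1 + math.ceil(process_ms / ms_per_sector)

print(best_interleave(0, 300, 9))     # no processing       -> 1:1
print(best_interleave(5, 300, 9))     # quick record crunch -> 2:1
print(best_interleave(40, 300, 9))    # slow record crunch  -> 3:1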

So, what is optimum for you might not be what is optimum for somebody 
with a weirder use.

--
Grumpy Ol' Fred     		cisin at xenosoft.com

