At 11:43 PM 1/5/2005, you wrote:
> And the programmers were not smart enough to figure a way around this
> because...why?
It's been a while, but MS-DOS derived its programming models from CP/M,
not Unix. If a program is just expecting ASCII, you can use a
meta-character (like CTRL/Z) in the stream to mark where to stop (or
CTRL/S and CTRL/Q to pause and resume a handshaked character stream,
for that matter); otherwise you need a programming model that can
signal the notion of "end of file" outside the band of the simple
stream of bytes (and handle other flow of control out of band as well).
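To make the in-band case concrete, here's a rough sketch of what a
text-copying loop in the CP/M/DOS world effectively has to do. The
0x1A value for CTRL/Z is the real convention for marking the end of a
text file there; the routine itself is just my illustration, not
anything lifted from DOS:

    #include <stdio.h>

    #define CTRL_Z 0x1A   /* CP/M and DOS mark end of text with this byte */

    /* Copy a text stream, treating an in-band CTRL/Z as the logical end.
       Illustrative sketch only: the marker lives inside the data stream,
       so a file containing a real 0x1A byte gets cut short. */
    void copy_text(FILE *in, FILE *out)
    {
        int c;
        while ((c = getc(in)) != EOF && c != CTRL_Z)
            putc(c, out);
    }

That's exactly why binary copies go wrong under this convention: the
marker is an ordinary byte value, so any binary file that happens to
contain 0x1A looks like it ended early.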
At the low level, DOS didn't have an out-of-band model like that at
first. Maybe it was added later; maybe it never was. Certainly DOS, as
the shell, needed to maintain compatibility with the original method.
Perhaps someone else can say whether or when it was added to later models
of character devices in DOS or Windows, and whether it shows up
in the shell. (I do remember using several Unix-like DOS command
shell replacements to get around cruft like this. To this day,
a Cygwin shell resides on my Win2000 desktop.)
Sure, Unix is very good at copying binary or ASCII, and that's where
your expectation comes from. When you "cat >file", you enter a
meta-character like CTRL/D to say you're done entering characters
(you're at end of file), and the terminal driver turns that into the
proper out-of-band indication to "cat" rather than handing it the
actual CTRL/D.
The program sees that indication through a read function that returns
more information than just bytes. For example, the heart of a
simplified "cat" program looks like:
    while( (c = getc( rptr )) != EOF ) {
        putchar( c );
    }
where EOF is an int (usually -1) that isn't a valid character value,
which is why c has to be an int rather than a char. In English:
"read a character, write the character, until end of file."
I don't pretend to be a Unix or low-level MS-DOS expert who can
quote chapter and verse, but I think this is the gist of it.
- John