On Apr 3, 2016, at 12:19 AM, Tomasz Konojacki <me at xenu.pl> wrote:
On Sat, 2 Apr 2016 18:58:10 -0400 (EDT), Mouse <mouse at Rodents-Montreal.ORG> wrote:
He's assuming the "the entire address
space is a single
array of bytes (perhaps with holes)" memory model is the only possible
one. He needs to talk with someone who wrote large-model 8086 code -
or someone who's used the Lisp Machine C compiler I heard of that
represents pointers as <array,index> pairs
Indeed, the Intel segmented memory model was weird. Near pointers were not comparable across segments, but that wasn't too unintuitive. Far pointers were insanity-inducing, though. Since there were multiple ways to represent the same address as a far pointer, there was no point at all in comparing them (unless you normalized them first, of course). Thankfully, huge pointers behaved exactly as one would expect, i.e. just like pointers in protected mode.
There we have the issue. Often, when people speak of what they "expect", they are relying on an assumption about a C program that the standard does not allow them to make. It's "expectations" about how particular compilers have worked in the past, about how they have compiled programs that are in fact undefined, that get you into trouble. When a better compiler (with more powerful optimization) breaks the program, the compiler is blamed rather than the programmer who made the incorrect assumption.
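A concrete (hypothetical) example of such an assumption: relational comparison of pointers into two different objects. The standard defines <, <=, >, >= only for pointers into the same array object (C11 6.5.8), but on a flat address space it happens to "work", so people learn to expect it.

    #include <stdio.h>

    int main(void) {
        int a[4], b[4];

        /* Undefined behavior: relational comparison is defined
           only for pointers into the same array object. On a
           flat address space it appears to work; under
           large-model 8086, comparing near pointers from
           different segments compared only the offsets and gave
           garbage. */
        if (&a[0] < &b[0])
            puts("a below b (nothing guarantees this result)");
        else
            puts("b below a (nothing guarantees this result)");

        /* Equality (== and !=) IS defined for any valid
           pointers, regardless of which objects they address. */
        printf("a == b: %d\n", (void *)a == (void *)b);
        return 0;
    }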
Ideally, compilers would flag all undefined programs, but in practice they do not. I'm not sure such a thing is even possible in C; C is not what you would call a sanely designed language.
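One reason complete static flagging is impossible: whether a program is undefined can depend on data that only exists at run time. A minimal sketch:

    #include <limits.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* This program is undefined exactly when its input is
       INT_MAX: x + 1 then overflows a signed int. A compiler
       cannot flag this statically without also rejecting a vast
       amount of perfectly correct code. */
    int main(int argc, char **argv) {
        int x = (argc > 1) ? atoi(argv[1]) : 0;
        printf("%d\n", x + 1);  /* UB iff x == INT_MAX */
        return 0;
    }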
This paper