George Currie wrote:
> It works because C compilers treat array subscripts as simple
> pointer arithmetic,
It's not that C compilers have chosen to do that; it's the C language
itself that *requires* it. My little program doesn't work by accident:
it is in fact a fully standard-conformant program.
so i["He...\n"] is the same as (i *
sizeof(char)) + "He...\n" (the
string is a converted to a pointer
internally).
Very close. You omitted the pointer dereference, and the
multiplication by sizeof(char) is incorrect, though it expresses your
intent. When you write
"Somestring" + i
the quoted literal is an array of elements of type char, which becomes
a pointer to char. The language requires that addition of an integer
and a pointer to someType be implicitly scaled by sizeof(someType), so
the multiplication by sizeof(char) happens "under the hood". For char,
where sizeof(char) is 1 by definition, your explicit multiplication
happens to be harmless, but for any type with sizeof(type) > 1 the
extra multiplication would compute the wrong address.
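To make the implicit scaling concrete, here is a small sketch (the
array and its values are invented purely for illustration):

    #include <stdio.h>

    int main(void)
    {
        int arr[4] = {10, 20, 30, 40};
        int *p = arr;

        /* p + 2 advances by 2 * sizeof(int) bytes under the hood,
           so this expression reads arr[2]. */
        printf("%d\n", *(p + 2));   /* prints 30 */

        /* Scaling by hand, as in *(p + 2 * sizeof(int)), would be
           scaled *again* by the compiler and read far past arr. */
        return 0;
    }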
In C, the expression
a[b]
is essentially a shorthand for
*(a+b)
Since addition is commutative (even between pointers and integers),
that is equivalent to
*(b+a)
which is, of course, equivalent to
b[a]
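A quick way to convince yourself (the array here is made up, but any
would do):

    #include <stdio.h>

    int main(void)
    {
        int a[3] = {7, 8, 9};

        /* All four spellings denote the same element. */
        printf("%d %d %d %d\n", a[1], *(a + 1), *(1 + a), 1[a]);
        /* prints: 8 8 8 8 */
        return 0;
    }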
In the specific example,
i["He...\n"]
is equivalent to
*(i + "He...\n")
or
*("He...\n" + i)
or
"He...\n"[i]
Eric
I didn't mean for it to be syntactically correct; I was only giving the
basic gist of why it works the way it does (hence the missing
dereference and the explicit sizeof(char)). I'm also aware that this
isn't undefined behavior or a side effect; I didn't say "might/can/likely
treat" or "some/many/most C compilers" :)
Unlike some of the other folks responding, I personally don't find this
heinous at all; as a matter of fact, I think it's a great example of a
language where, once you understand some of the basic underpinnings, it
becomes a fairly straightforward matter to suss out "odd" constructs
like the one above.