George Currie wrote:
> It works because C compilers treat array subscripts as simple
> pointer arithmetic,
It's not that C compilers have chosen to do that; it's the C language
itself that *requires* it. My little program doesn't work by accident:
it is a fully standard-conformant program.
so i["He...\n"] is the same as (i *
sizeof(char)) + "He...\n" (the
string is a converted to a pointer
internally).
Very close. You omitted the pointer dereference, and the explicit
multiplication by sizeof(char), though it expresses the right intent,
is incorrect. When you write
"Somestring" + i
the quoted literal is an array of elements of type char, which decays
to a pointer to char. The language requires that addition of an
integer and a pointer to someType include an implicit scaling by
sizeof(someType); the multiplication by sizeof(char) thus happens
"under the hood". For char, whose size is 1 by definition, your
explicit multiplication happens to be harmless, but for any type with
sizeof(type) > 1 it would compute the wrong address.
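To make the hidden scaling visible, here is a minimal sketch of my own
(not part of the original exchange) that measures the byte distance
from p to p + 1 for int, a type wider than char:

    #include <stdio.h>

    int main(void)
    {
        int a[4] = {10, 20, 30, 40};
        int *p = a;  /* the array decays to a pointer to its first element */

        /* p + 1 moves one whole int forward; the scaling by sizeof(int)
           is supplied by the language, not written by the programmer. */
        printf("sizeof(int) = %zu\n", sizeof(int));
        printf("bytes from p to p + 1: %td\n", (char *)(p + 1) - (char *)p);
        return 0;
    }

Both lines print the same number (typically 4): p + 1 advanced by
sizeof(int) bytes, not by one byte.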
In C, the expression
a[b]
is, by definition, shorthand for
*(a+b)
Since addition is commutative (even between pointers and integers), that
is equivalent to
*(b+a)
which is, of course, equivalent to
b[a]
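A minimal sketch of that equivalence (my example, not from the thread),
using a non-char element type so the implicit scaling matters:

    #include <stdio.h>

    int main(void)
    {
        int arr[] = {11, 22, 33};

        /* arr[2] and 2[arr] are both *(arr + 2), so they read the same
           element, with the scaling by sizeof(int) in either spelling. */
        printf("%d %d\n", arr[2], 2[arr]);  /* prints: 33 33 */
        return 0;
    }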
In the specific example,
i["He...\n"]
is equivalent to
*(i + "He...\n")
or
*("He...\n" + i)
or
"He...\n"[i]
Eric