Mail Archives: djgpp/1996/12/03/08:12:25
G.P. Tootell wrote:
>
> ok, i had this bug in my code the other day which took ages to track down, and
> eventually it turned out to be this..
>
> i'm used to doing a=b=c=0; to set more than one variable at a time, and by error
> i had extended this analogy to if (a==b==c==0) { do stuff }
> no, djgpp didn't throw up any errors about this, even with -Wall, and indeed ran the
> compiled code fine. however, the interpretation of this is not what i expected.
> instead of treating this as a==b && b==c && c==0, it treated it as a==b && c==0
> can anyone explain this behavior? should it not have been reported as a warning?
>
ok, i'll give this a try but i may be totally wrong.
here is what seems like a logical explanation to me. in c, the ==
operator is left-associative, so when the compiler parses the chain
a==b==c==0 it groups it from the left:

    a==b==c==0   means   ((a == b) == c) == 0

a==b evaluates to 1 or 0 (true or false), that 1 or 0 is then compared
with c, and the result of *that* comparison is compared with 0. so the
chain never checks that all three variables are equal; it just compares
truth values with plain ints (which is not a==b && b==c && c==0). and
since all of this is perfectly legal c, the compiler has nothing to
diagnose, which would explain why -Wall stayed quiet.
this seems like a logical explanation to me. however, i won't be able to
test it for a couple of days. sorry for the unsubstantiated claims.
sinan.