Mail Archives: djgpp/1996/04/25/03:41:32
>As for the 3rd question, I'm not sure what the ANSI spec says about the type
>of the *_MIN/*_MAX constants. I think the smallest type such constants are
>usually represented in is int, so if int is 16 bits wide, the first
>definition may cause the same problem as LONG_MIN; with 32-bit ints it works
>anyway. So if they have one definition that works in both modes, they can use
>that one and don't have to clutter up their headers with additional #ifdefs.
>(I would expect that bcc has similar optimizations for evaluating constant
>expressions as gcc does.)
I'd go for the 'saving the #ifdefs' theory, as additional #ifdefs would slow
the compiler down far more than the subtraction does.
>BTW: I have no idea if bcc in 32 bit mode has 32 bit ints (I think the mode
>is called 32 bit because of the address space), but I guess it would be a
>good idea.
bcc ints *are* 32 bits in 32-bit mode, so the SHRT_MIN subtraction definition
is not strictly necessary there.