Mail Archives: djgpp/1996/04/25/03:41:32

Xref: news2.mv.net comp.os.msdos.djgpp:3074
From: mcantos AT ocean DOT com DOT au (Marcelo Cantos)
Newsgroups: comp.os.msdos.djgpp
Subject: Re: LONG_MIN question
Date: Wed, 24 Apr 96 14:14:02 GMT
Organization: (private)
Lines: 18
Message-ID: <4llgfk$n93@news.mel.aone.net.au>
References: <01I3RZ7ZOPMQ005280 AT cc DOT uab DOT es> <4lg9m3$1cod AT rs18 DOT hrz DOT th-darmstadt DOT de>
NNTP-Posting-Host: 203.12.234.172
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

>As for the 3rd question, I'm not sure what the ANSI spec says about the type
>of the *_MIN/*_MAX constants. I think the smallest type such constants are
>usually represented in is int, so if int is 16 bits wide, the first definition
>may cause the same problem as LONG_MIN; with 32-bit ints it works anyway.
>So if they have a definition that works in both modes, they can use that one
>and don't have to clutter up their headers with additional #ifdefs.
>(I would expect that bcc has optimizations for evaluating constant
>expressions similar to gcc's.)

I'd go for the 'saving the #ifdefs' theory, as additional #ifdefs would slow 
the compiler down far more than the subtraction.
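
To make the LONG_MIN issue concrete, here's a small illustrative program
(the macro names are mine, not from any actual header): under a strict C89
compiler with 32-bit longs, the literal 2147483648L doesn't fit in long, so
it gets type unsigned long, and negating it still leaves a positive value.
The subtraction form stays within long's range throughout.

#include <stdio.h>

/* Naive form: on a C89 compiler with 32-bit long, 2147483648L is too
   big for long, so it is typed unsigned long; the unary minus then
   yields the (positive) unsigned value 2147483648. */
#define LONG_MIN_NAIVE (-2147483648L)

/* Subtraction form: both operands fit in long, so the expression
   keeps type long and the intended value -2147483648. */
#define LONG_MIN_SAFE  (-2147483647L - 1L)

int main(void)
{
    /* On a strict C89 compiler with 32-bit long the first test fires;
       newer compilers give the literal a wider signed type, so it
       does not. */
    if (LONG_MIN_NAIVE > 0L)
        printf("naive LONG_MIN compares greater than zero!\n");
    if (LONG_MIN_SAFE < 0L)
        printf("subtraction LONG_MIN is %ld\n", LONG_MIN_SAFE);
    return 0;
}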

>BTW: I have no idea if bcc in 32 bit mode has 32 bit ints (I think the mode
>is called 32 bit because of the address space), but I guess it would be a
>good idea.

bcc ints *are* 32-bit in 32-bit mode, so the SHRT_MIN subtraction definition 
is not strictly necessary.
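
A quick way to see the difference (illustrative, not taken from bcc's actual
headers): sizeof reveals the type each spelling of SHRT_MIN gets. With 16-bit
ints, -32768 is typed long while the subtraction form stays int; with 32-bit
ints both are plain ints, which is why the trick is redundant there.

#include <stdio.h>

int main(void)
{
    /* With 16-bit ints, 32768 does not fit in int, so -32768 is typed
       long (sizeof 4) while -32767 - 1 stays int (sizeof 2).  With
       32-bit ints both expressions are ints of the same size. */
    printf("sizeof(-32768)     = %u\n", (unsigned)sizeof(-32768));
    printf("sizeof(-32767 - 1) = %u\n", (unsigned)sizeof(-32767 - 1));
    return 0;
}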
