Mail Archives: djgpp/1997/07/13/13:37:55
Eric Liao wrote:
>
> Hmm? 32 bit ints are different? That might be the problem.
> I used GDB and got an 8 digit or so number, and I was wondering how
> that could fit in an int. I always thought an int was 2 bytes on any
> compiler. Is a char still one byte? What are the other types? Maybe
> I should go look them up. Thanks for the help.
ANSI defines int as the natural word size suggested by the target
architecture, and requires that
sizeof(short) <= sizeof(int) <= sizeof(long). It says that short must
be able to hold at least 16 bits and long at least 32 bits, but leaves
the actual sizes implementation-defined. Every modern Unix system uses
32-bit ints; 16-bit ints are a DOS peculiarity that DJGPP fortunately
does not share. ;) If you write a program that depends on having at
least 16 or 32 bits, you should use short or long explicitly rather
than int.
ANSI actually defines sizeof(char) to be exactly 1 byte, where a byte
must have at least 8 bits (CHAR_BIT in <limits.h>). That leaves the
number of bits open in theory, but you can assume a char is one 8-bit
byte on any rational system.
If you want to know the sizes of all the data types, just use a program
like this:
/* sizes.c */
#include <stdio.h>

int main(void)
{
    /* sizeof yields size_t, not long, so cast the results to
       unsigned long to match the %lu format specifiers. */
    printf("Type sizes for this compiler:\n"
           "  char        = %lu  short  = %lu\n"
           "  int         = %lu  long   = %lu\n"
           "  float       = %lu  double = %lu\n"
           "  long double = %lu  char * = %lu\n",
           (unsigned long) sizeof(char), (unsigned long) sizeof(short),
           (unsigned long) sizeof(int), (unsigned long) sizeof(long),
           (unsigned long) sizeof(float), (unsigned long) sizeof(double),
           (unsigned long) sizeof(long double),
           (unsigned long) sizeof(char *));
    return 0;
}
Compile this on each compiler you use and run it to see the type sizes.
--
---------------------------------------------------------------------
| John M. Aldrich |"Men rarely (if ever) manage to dream |
| aka Fighteer I |up a god superior to themselves. Most |
| mailto:fighteer AT cs DOT com |gods have the manners and morals of a |
| http://www.cs.com/fighteer |spoiled child." - Lazarus Long |
---------------------------------------------------------------------