Mail Archives: djgpp/1998/10/02/21:28:03
On 2 Oct 98 at 20:52, Uhfgood wrote:
> I have something weird... Basically I made a small program that asked for the
> age of Methuselah (a character in the Bible who lived 969 years)... Anyhow it
> was supposed to demonstrate the atoi function. Well, it works, but there's
> something peculiar... basically I read the number into the character array
> years (as such: char years[8]) and then I use age (which is an integer) like
> this ( age = atoi(years) ).
> Now I checked my code, nothing wrong with the statements... and yes, I did use
> an i, not an l... Anyhow, when I run the program it works fine, no problems,
> but that's the point... I can type in up to 10 digits without it displaying some
> other number... It's as if I had used a long instead of an int, but it is for
> sure an int... Can someone explain why I'm able to type up to 10 digits with
> no problems? I thought an integer could only hold up to 32767... Please reply
> soon, and I'm sorry if this is an annoying newbie question.
The sizes of the integer types aren't specified exactly by the
ANSI C standard -- it imposes minimum ranges, but different
compilers can and do use different sizes.
DJGPP is a 32-bit environment, so `int' is 32 bits wide -- the
maximum signed value is 2,147,483,647, just over two billion.
`long' is the same size; `short' is 16 bits. There's also
`long long', which is 64 bits wide, but it isn't defined by the
ANSI C standard.
Incidentally, though, you shouldn't type that many characters into
your program: your buffer is only 8 characters long, so 10 digits
plus the terminating '\0' overflow it.
--
george DOT foot AT merton DOT oxford DOT ac DOT uk