Message-Id: <199810030127.VAA08982@delorie.com>
Comments: Authenticated sender is
From: "George Foot"
To: uhfgood AT aol DOT com (Uhfgood)
Date: Sat, 3 Oct 1998 02:25:19 +0000
MIME-Version: 1.0
Content-type: text/plain; charset=US-ASCII
Content-transfer-encoding: 7BIT
Subject: Re: Using the atoi function...
Reply-to: mert0407 AT sable DOT ox DOT ac DOT uk
CC: djgpp AT delorie DOT com
X-mailer: Pegasus Mail for Win32 (v2.42a)

On 2 Oct 98 at 20:52, Uhfgood wrote:

> I have something weird... Basically I made a small program that asked for
> the age of Methuselah (a character in the Bible who lived 969 years).
> Anyhow, it was supposed to demonstrate the atoi function. Well, it works,
> but there's something peculiar: basically I read the number into the
> character array years (as such, char years[8]) and then I use age (which
> is an integer) like this ( age = atoi(years) ).
> Now I checked my code, and there's nothing wrong with the statements...
> and yes, I did use an i, not an l. Anyhow, when I run the program it works
> fine, no problems, but that's the point: I can type in up to 10 digits
> without it displaying some other number. It's as if I had used a long
> instead of an int, but it is for sure an int. Can someone explain why I'm
> able to type up to 10 digits with no problems? I thought an integer could
> only hold up to 32767. Please reply soon, and I'm sorry if this is an
> annoying newbie question.

The sizes of integer types in C aren't specified exactly by the ANSI
standard -- there are some restrictions, but different compilers can and do
use different widths for these types. DJGPP is a 32-bit environment, so
`int' is 32 bits wide -- the maximum signed value is just over
2,000,000,000 (2,147,483,647, to be exact), which is why your 10-digit
inputs can still fit. `long' is also this size; `short' is 16 bits.
There's also `long long', which is 64 bits wide, but it's not defined by
the ANSI C standard.

Incidentally, though, you shouldn't type that many characters into your
program, because your buffer is only 8 characters long -- 10 digits plus
the terminating null need 11 bytes, which overflows char years[8].
-- george DOT foot AT merton DOT oxford DOT ac DOT uk