From: kagel AT quasar DOT bloomberg DOT com
Date: Tue, 18 Feb 1997 16:01:28 -0500
Message-Id: <9702182101.AA02135@quasar.bloomberg.com>
To: nikki AT gameboutique DOT co
Cc: djgpp AT delorie DOT com
In-Reply-To: <5e4dji$99q@flex.uunet.pipex.com> (nikki@gameboutique.co)
Subject: Re: weird gcc thing
Reply-To: kagel AT dg1 DOT bloomberg DOT com
Errors-To: postmaster AT ns1

From: nikki AT gameboutique DOT co (nikki)
Newsgroups: comp.os.msdos.djgpp
Date: 15 Feb 1997 13:27:14 GMT
Organization: GameBoutique Ltd.
Lines: 28
Nntp-Posting-Host: www.gameboutique.com
Mime-Version: 1.0
X-Newsreader: knews 0.9.8
Dj-Gateway: from newsgroup comp.os.msdos.djgpp
Content-Type: text/plain; charset=us-ascii
Content-Length: 647

   Is there any particular reason why:

      double wib;
      wib = (double) atof(argv[2]);

   or

      float wib;
      int a;
      wib = (float) a;

   should both give wrong values under djgpp?  Both work fine in a
   program I've been porting from UNIX, yet the atof one gives wildly
   wrong answers.  E.g. if I pass the argument '8.0' it gets evaluated
   as wib = 456246264624525.00 or something ridiculous.  Removing the
   cast makes it work, of course (and yes, I know the cast is pointless
   here, but it was in the program I was converting :).  I just wondered
   why this cast should cause a problem.

Do you have a prototype for atof() in your source, or did you include
stdlib.h?  Without a prototype, atof is assumed to return an int, hence
the screwed-up result when you explicitly cast that int to a double.
Try adding either an explicit prototype for atof() [double atof( const
char * )] or including stdlib.h.  If your UNIX compiler is pre-ANSI,
that would explain why it works there, too.

--
Art S. Kagel, kagel AT quasar DOT bloomberg DOT com

A proverb is no proverb to you 'till life has illustrated it.  -- John Keats
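
For reference, here is a minimal sketch of the corrected usage.  The file
layout, argument handling, and variable names below are only illustrative
(they are not taken from nikki's actual program); the point is simply that
once <stdlib.h> is included, the compiler sees the prototype for atof()
and the cast becomes redundant but harmless:

   /*
    * Minimal sketch, not the original program: with <stdlib.h> included
    * the compiler sees "double atof(const char *)", so the (double) cast
    * is harmless.  Without the include (and with no explicit prototype),
    * a pre-ANSI-style implicit declaration makes atof() appear to return
    * an int, and the cast then converts a meaningless int value -- which
    * is exactly the kind of garbage reported above.
    */
   #include <stdio.h>
   #include <stdlib.h>     /* declares: double atof(const char *) */

   int main(int argc, char *argv[])
   {
       double wib;

       if (argc < 3) {
           fprintf(stderr, "usage: %s <ignored> <number>\n", argv[0]);
           return 1;
       }

       wib = (double) atof(argv[2]);   /* fine once the prototype is in scope */
       printf("wib = %f\n", wib);
       return 0;
   }

Run with something like "prog x 8.0" this should print wib = 8.000000;
remove the #include and a compiler that merely warns about the implicit
declaration (as DJGPP's gcc did at the time) will reproduce the garbage
value.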