Mail Archives: djgpp/1998/06/17/08:30:54

From: sassi AT biomed DOT polimi DOT it (Roberto Sassi)
Newsgroups: comp.os.msdos.djgpp
Subject: A question about atof() and the double 0.1
Date: Wed, 17 Jun 1998 11:18:40 GMT
Organization: Politecnico di Milano - Centro Informatico di Ateneo
Lines: 60
Message-ID: <3587a411.12016463@news.polimi.it>
NNTP-Posting-Host: pccaiani.bioing.polimi.it
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

Hi,
This is the first time I have written to this newsgroup, so please be
patient with my silly question and my bad English.
I was used to gcc on HP and SUN workstations, but yesterday I
downloaded and installed DJGPP for Intel machines from Delorie.
Trying it out, I found a strange behaviour in the atof() function:
it converts a string containing "0.1" into a double with some extra
decimals in the last positions.
This is the source:

#include <stdlib.h>
#include <stdio.h>

int main(void) {

 char rr[10];
 double r=0.0;
 double inc=0.05;

 rr[0]='0';
 rr[1]='.';
 rr[2]='1';
 rr[3]='\0';

 r= (double) atof(rr);
 if(r==((double)0.1)) printf("OKAY\n");
 else printf("ERROR\n");
 printf("%40.38f\n",r);

 r+=inc;
 if(r<=0.15) printf("OKAY\n");
 else printf("ERROR\n");

 rr[0]='0';
 rr[1]='.';
 rr[2]='2';
 rr[3]='5';
 rr[4]='\0';

 r= (double) atof(rr);
 if(r==((double)0.25)) printf("OKAY\n");
 else printf("ERROR\n");
 printf("%40.38f\n",r);
 return 0;
}

And this is the DOS output:

OKAY
0.10000000000000000555111512312578270212
ERROR
OKAY
0.25000000000000000000000000000000000000

As you can see, with the number 0.25, there is no problem.
Can you help me understand this behaviour?
Thanks in advance!
Roberto Sassi
sassi AT biomed DOT polimi DOT it

