From: Ned Ulbricht
Newsgroups: comp.os.msdos.djgpp
Subject: Re: printf 'g' conversion
Date: Sun, 01 Mar 1998 16:54:31 -0800
Organization: University of Washington
Lines: 31
Message-ID: <34FA0346.33@ee.washington.edu>
References:
NNTP-Posting-Host: cs236-16.student.washington.edu
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp
Precedence: bulk

Eli Zaretskii wrote:
>
> On Fri, 27 Feb 1998, Andrew Gibson wrote:
>
> > From Linux this is what I get. From DJGPP I get 9 decimal places and
> > only 8 sig figs.
>
> I'm not sure that this is a bug.  My ANSI C references don't seem to be
> definitive on this.  Can someone with a clearer understanding of what
> the standard says clarify this?

The Working Draft, 97-11-21, WG14/N794 J11/97-158, p. 290 (cf. p. 287),
seems slightly ambiguous on this point, but under 'g,G' it says "the
number is converted in style f or e (...), with the precision specifying
the number of significant digits."  I read the "with" as meaning that the
g,G conversion always applies its own semantics to the precision,
irrespective of the "style" used for the output; i.e., it consistently
takes over the meaning of precision.

I can read K&R II either way, so no help there.  Harbison & Steele (a
pre-ANSI reference for compiler writers) give a definitive algorithm for
g,G which supports the behavior reported under Linux gcc.  A couple of
fairly recent MS-DOS compilers also generate that behavior.

So, IMHO, it is a bug in djgpp (I've confirmed the behavior on my libc.a
installation as well).

--
Ned Ulbricht
mailto:nedu AT ee DOT washington DOT edu