Mail Archives: djgpp-workers/1997/12/18/08:40:45

Date: Thu, 18 Dec 1997 08:39:09 -0500 (EST)
Message-Id: <199712181339.IAA11366@delorie.com>
From: DJ Delorie <dj AT delorie DOT com>
To: Vik DOT Heyndrickx AT rug DOT ac DOT be
CC: molnarl AT cdata DOT tvnet DOT hu, djgpp-workers AT delorie DOT com
In-reply-to: <3498D668.3B38@rug.ac.be> (message from Vik Heyndrickx on Thu, 18
Dec 1997 08:53:12 +0100)
Subject: Re: char != unsigned char... sometimes, sigh

> > > Since getc is only supposed to be used for text files, I think we should
> > > change it to return chars in the range [-128..127], so that comparisons
> > > work.
> > 
> > I think this is a bad idea, because most programs assume that EOF is -1.
> 
> I meant 'return ints in the range [-128..127] or -1'. Do I always have
> to be *this* exact? I think it was obvious what I meant to say.

Changing getc to return char instead of int makes DJGPP different from
other compilers, which return int even when chars are unsigned.  Given
how long compilers have been doing this, I think it would be a mistake
to make this change, no matter how well-intentioned it may be.
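
To make that concrete, here is a minimal sketch of the idiom nearly
every portable program already uses, which depends on getc() returning
an int (the file name is just a placeholder):

    #include <stdio.h>

    int main(void)
    {
      FILE *fp = fopen("input.txt", "r");  /* placeholder file name */
      int c;                               /* int, not char, on purpose */

      if (fp == NULL)
        return 1;
      while ((c = getc(fp)) != EOF)        /* EOF (-1) lies outside 0..255 */
        putchar(c);
      fclose(fp);
      return 0;
    }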

Returning char instead of int also makes it *impossible* to reliably
detect EOF when reading binary files (without using feof()).  Programs
expect -1 to mean EOF, and forcing them to use feof() would mean extra
work to port most programs, work that would not be needed if we simply
left getc() alone.
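
A minimal sketch of the collision, assuming bytes were sign-extended
into [-128..127] as proposed (EOF is -1 here, as most programs assume):

    #include <stdio.h>

    int main(void)
    {
      /* A 0xFF byte sign-extended into [-128..127], as the proposed
         getc() change would do, is indistinguishable from EOF.      */
      signed char byte = '\xff';
      int c = byte;                        /* sign extension gives -1 */

      printf("c == EOF: %d\n", c == EOF);  /* prints 1 */
      return 0;
    }

So under the proposed change, every apparent EOF from a binary stream
would need a follow-up feof()/ferror() check before it could be trusted.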
