Mail Archives: djgpp-workers/2001/08/08/10:28:30

Date: Wed, 8 Aug 2001 07:27:01 -0700 (PDT)
From: Paul Eggert <eggert AT twinsun DOT com>
Message-Id: <200108081427.f78ER1m28155@sic.twinsun.com>
To: eliz AT is DOT elta DOT co DOT il
CC: rich AT phekda DOT freeserve DOT co DOT uk, djgpp-workers AT delorie DOT com
In-reply-to: <Pine.SUN.3.91.1010808095610.23412D-100000@is>
(eliz AT is DOT elta DOT co DOT il)
Subject: Re: GNU ls bug on DJGPP with large files
References: <Pine DOT SUN DOT 3 DOT 91 DOT 1010808095610 DOT 23412D-100000 AT is>
Reply-To: djgpp-workers AT delorie DOT com
Errors-To: nobody AT delorie DOT com
X-Mailing-List: djgpp-workers AT delorie DOT com
X-Unsubscribes-To: listserv AT delorie DOT com

> Date: Wed, 8 Aug 2001 09:57:15 +0300 (IDT)
> From: Eli Zaretskii <eliz AT is DOT elta DOT co DOT il>
> 
> FAT32 volumes don't support files larger than 4GB anyway, at least for
> the system calls that DJGPP programs can use

That sounds like a real problem to me.  I regularly deal with files
larger than that these days.  I can buy 60GB disk drives for less than
US$2/GB.  (DJGPP can't deal with a 4GB file whose disk space costs only $8?  Ouch...)

> to me this sounds like using a sledgehammer
> where a simple, if somewhat kludgey, type-cast in a single program would 
> have put this issue to rest.
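
For concreteness, I assume the cast you have in mind is something
like the one below (my guess, not your actual code).  It happens to
do the right thing on DJGPP, where off_t and unsigned long are both
32 bits, but it silently truncates large sizes on any host whose
off_t is wider than unsigned long:

      sprintf (p, "%8s ",
	       human_readable ((uintmax_t) (unsigned long) f->stat.st_size,
			       hbuf, 1,
			       output_block_size < 0 ? output_block_size : 1));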

But it doesn't put it to rest, for two reasons.

First, the type cast isn't portable to other hosts.  For portable
code, you need to use a patch like this instead (which I've already
sent to Jim Meyering, so I've removed him from the CC list):

--- ls.c-bak	Fri Aug  3 15:47:58 2001
+++ ls.c	Mon Aug  6 15:38:31 2001
@@ -2637,8 +2637,17 @@ print_long_format (const struct fileinfo
   else
     {
       char hbuf[LONGEST_HUMAN_READABLE + 1];
+      uintmax_t size = f->stat.st_size;
+
+      /* POSIX requires that the size be printed without a sign, even
+	 when negative.  Assume the typical case where negative sizes
+	 are actually positive values that have wrapped around.  */
+      if (sizeof f->stat.st_size < sizeof size)
+	size += ((uintmax_t) (f->stat.st_size < 0)
+		 << (sizeof f->stat.st_size * CHAR_BIT));
+
       sprintf (p, "%8s ",
-	       human_readable ((uintmax_t) f->stat.st_size, hbuf, 1,
+	       human_readable (size, hbuf, 1,
 			       output_block_size < 0 ? output_block_size : 1));
     }
 

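In case the trick is clearer outside of ls, here is a minimal
stand-alone sketch of the same wraparound correction.  The off32_t
typedef and the sample value are mine, purely for illustration, and
it assumes a C99 compiler (for <stdint.h>) and the usual
two's-complement wraparound:

#include <limits.h>	/* CHAR_BIT */
#include <stdint.h>	/* int32_t, uintmax_t */
#include <inttypes.h>	/* PRIuMAX */
#include <stdio.h>

/* Stand-in for DJGPP's 32-bit signed off_t.  */
typedef int32_t off32_t;

/* Undo the sign-extension for negative sizes, assuming they are
   positive values that wrapped around, exactly as in the ls.c
   patch above.  */
static uintmax_t
unsigned_size (off32_t st_size)
{
  uintmax_t size = st_size;	/* sign-extends when st_size < 0 */
  if (sizeof st_size < sizeof size)
    size += ((uintmax_t) (st_size < 0)
	     << (sizeof st_size * CHAR_BIT));
  return size;
}

int
main (void)
{
  off32_t wrapped = (off32_t) 3221225472u;	/* 3 GB, stored as -1073741824 */
  printf ("%" PRIuMAX "\n", unsigned_size (wrapped));	/* prints 3221225472 */
  return 0;
}
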
Second, this patch fixes the problem only for GNU 'ls' (which is a
special case because of the POSIX requirement), and it fixes it only
for files smaller than 4 GB.  The patch doesn't fix the problem for
gzip, tar, find, or any of the dozens of other programs that have to
deal with large files in normal use.  And files larger than 4 GB are
still mishandled.

> Introducing a new large-file feature, and a
> compile-time feature on top of that (which means you need 2 versions
> of every library and elaborate configury stuff to choose the right one
> when you build packages)

It's fine with me if DJGPP makes an incompatible change and simply
changes off_t from 32 bits to 64 bits.  That wouldn't hurt portable code.
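
Just to make "portable" concrete: code that funnels st_size through
intmax_t (or uintmax_t, as in the patch above) and never hard-codes
a 32-bit printf format simply doesn't care how wide off_t is.  A
sketch (print_size is my own example name, not from any package):

#include <sys/types.h>
#include <sys/stat.h>
#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

/* Print a file's size without assuming the width of off_t.  This
   compiles and behaves the same whether off_t is 32 or 64 bits;
   only code with hard-wired casts like (long) would notice the
   change.  */
static int
print_size (char const *name)
{
  struct stat st;
  if (stat (name, &st) != 0)
    return -1;
  printf ("%s: %" PRIdMAX " bytes\n", name, (intmax_t) st.st_size);
  return 0;
}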

(In short, I think the DJGPP maintainers need to break out the
sledgehammers!  :-)
