
From: Hans-Bernhard Broeker <broeker AT physik DOT rwth-aachen DOT de>
Newsgroups: comp.os.msdos.djgpp
Subject: Re: Malloc bug in DJGPP V2.03
Date: 12 Sep 2000 11:48:45 GMT
Organization: Aachen University of Technology (RWTH)
Lines: 28
Message-ID: <8pl56t$3v2$1@nets3.rz.RWTH-Aachen.DE>
References: <20000911155441 DOT A493 AT ajax DOT netspace DOT net DOT au>
NNTP-Posting-Host: acp3bf.physik.rwth-aachen.de
X-Trace: nets3.rz.RWTH-Aachen.DE 968759325 4066 137.226.32.75 (12 Sep 2000 11:48:45 GMT)
X-Complaints-To: abuse AT rwth-aachen DOT de
NNTP-Posting-Date: 12 Sep 2000 11:48:45 GMT
Originator: broeker@
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp
Reply-To: djgpp AT delorie DOT com

Andrew Apted <ajapted AT netspace DOT net DOT au> wrote:
> I'm experiencing a problem with my DJGPP program (glBSP, a nodes
> builder for DOOM) where it runs out of memory under DOS, even though
> there should be plenty left.  I've memory-profiled the code (under
> Linux), and the most it ever uses (not including overheads) is 9 MB,
> yet my machine has 64 MB RAM.  This is with DJGPP V2.03.

The allocation strategy you use is prone to causing heap fragmentation
at an enormous rate. Growing memory blocks with realloc() in steps as
small as yours leaves behind a huge trail of small blocks that cannot
be reused for later, larger allocations. This can easily cost your
program a factor of 10 or more of its actually available memory, and
thus cause the failure.
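
A common cure is to grow buffers geometrically instead of in small
fixed steps, so the number of realloc() calls (and of abandoned old
blocks) stays logarithmic in the final size. A minimal sketch of the
idea (the type and function names below are made up for illustration,
not taken from glBSP):

    #include <stdlib.h>

    typedef struct {
        int   *items;
        size_t count;
        size_t capacity;
    } int_array;

    /* Append one value, doubling the capacity when full.
       Returns 0 on success, -1 on allocation failure. */
    int int_array_push(int_array *a, int value)
    {
        if (a->count == a->capacity) {
            size_t new_cap = a->capacity ? 2 * a->capacity : 16;
            int *p = realloc(a->items, new_cap * sizeof *p);
            if (p == NULL)
                return -1;        /* old block is still valid */
            a->items    = p;
            a->capacity = new_cap;
        }
        a->items[a->count++] = value;
        return 0;
    }

Starting from an int_array initialized to all zeros, building even a
9 MB array this way takes only roughly 20 realloc() calls in total,
instead of one call per element.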

> This is clearly a bug in DJGPP's malloc functions, right ?

No. Your program just hit an inherent limitation of any implementation
of 'malloc()': it will sometimes fail to allocate a large block even
though the necessary amount of memory does exist, in the form of
previously free()d, smaller blocks.  Memory cannot be moved around
arbitrarily, so you cannot join 1024 previously free()d blocks of 512
bytes each to fulfill a single request for 512 KB.
That's what 'heap fragmentation' means.
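
To see the effect in isolation, here is a small self-contained
demonstration (the block counts are arbitrary, and whether the final
request actually fails depends on the malloc implementation and on how
it obtains memory from the system):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        enum { N = 1024, SMALL = 512 };
        char *blocks[N];
        int i;

        /* Fill the heap with many small blocks... */
        for (i = 0; i < N; i++)
            blocks[i] = malloc(SMALL);

        /* ...then free every other one.  Half the memory is now
           free, but it sits in 512-byte holes separated by live
           blocks, so it cannot be coalesced into one large chunk. */
        for (i = 0; i < N; i += 2)
            free(blocks[i]);

        if (malloc((size_t) (N / 2) * SMALL) == NULL)
            puts("large request failed despite enough free memory");
        else
            puts("this allocator found room elsewhere");
        return 0;
    }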

-- 
Hans-Bernhard Broeker (broeker AT physik DOT rwth-aachen DOT de)
Even if all the snow were burnt, ashes would remain.
