Mail Archives: djgpp/1997/09/27/01:16:01

From: "J.E." <cellis AT voyageur DOT ca>
Newsgroups: comp.os.msdos.djgpp
Subject: 16 to 32-bit
Date: Fri, 26 Sep 1997 21:18:07 -0500
Organization: Bell Network Solutions
Lines: 52
Message-ID: <342C6CDE.34A6@voyageur.ca>
Reply-To: cellis AT voyageur DOT ca
NNTP-Posting-Host: 207.236.8.53
Mime-Version: 1.0
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp

I'm trying to make a delay function for my game, but the one I'm going
by is meant for a 16-bit compiler.  Here's the function:

------------------------------------------------------------------------
#include <stdio.h>
#include <conio.h>

void delay(int ticks);

int main()
{
  int total = 0;
  int ticks;
  clrscr();
  printf("Enter the number of clock cycles to delay: ");
  scanf("%d", &ticks);
  while (kbhit() == 0)
  {
    delay(ticks);
    total++;
    printf("\n  Tick --> %d", total);
  }
  return 0;
}

void delay(int ticks)
{
  /* 0x46C is the BIOS timer tick count in low memory,
     updated 18.2 times per second */
  unsigned long far *clock = (unsigned long far *)0x0000046CL;
  unsigned long now = *clock;
  /* unsigned subtraction stays correct if the counter wraps at midnight,
     so no abs() is needed */
  while (*clock - now < (unsigned long)ticks) {}
}
------------------------------------------------------------------------

Now, how do I convert this for 32-bit compatibility with DJGPP?  Do I
just remove the "far"s in the clock pointer declaration and then change
the pointer?  If so, what should the new pointer be?  A far pointer is
just segment:offset, right?  So, shouldn't that pointer also work in a
32-bit environment like DJGPP?  Thanks in advance for all your time and
help!! :-)

Jordan Ellis   <cellis AT voyageur DOT ca>

C:\DOS
C:\DOS\RUN
RUN\DOS\RUN

*nix version:

C:/BIN
C:/BIN/RUN
RUN/BIN/RUN



Copyright © 2019 by DJ Delorie   Updated Jul 2019