From: "J.E." Newsgroups: comp.os.msdos.djgpp Subject: 16 to 32-bit Date: Fri, 26 Sep 1997 21:18:07 -0500 Organization: Bell Network Solutions Lines: 52 Message-ID: <342C6CDE.34A6@voyageur.ca> Reply-To: cellis AT voyageur DOT ca NNTP-Posting-Host: 207.236.8.53 Mime-Version: 1.0 Content-Type: text/plain; charset=us-ascii Content-Transfer-Encoding: 7bit To: djgpp AT delorie DOT com DJ-Gateway: from newsgroup comp.os.msdos.djgpp Precedence: bulk I'm trying to make a delay function for my game, but the one I'm going by is meant for a 16-bit compiler. Here's the function: ------------------------------------------------------------------------ #include #include void delay(int ticks); int main() { int total = 0; int ticks; clrscr(); printf("Enter the number of clock cycles to delay: "); scanf("%d", &ticks); while (kbhit() == 0) { delay(ticks); total++; printf("\n Tick --> %d", total); } return 0; } void delay(int ticks) { unsigned long far *clock = (unsigned long far *)0x0000046CL; unsigned long now; now = *clock; while (abs(*clock - now) < ticks) {} } ------------------------------------------------------------------------ Now, how do I convert this to 32-bit compatability for DJGPP? Do I just remove the "far"'s in the clock pointer declarations and then change the pointer? If so, what should the new pointer be? A far pointer is just , right? So, shouldn't that pointer also work in a 32-bit environment like Djgpp? Thanks in advance for all your time and help!!:-) Jordan Ellis C:\DOS C:\DOS\RUN RUN\DOS\RUN *nix version: C:/BIN C:/BIN/RUN RUN/BIN/RUN