Mail Archives: djgpp/1997/11/09/04:44:21

From: eyal DOT ben-david AT aks DOT com
To: dehacked72 AT hotmail DOT com
cc: djgpp AT delorie DOT com
Message-ID: <4225654A.0033DAED.00@aks.com>
Date: Sun, 9 Nov 1997 11:38:08 +0200
Subject: Re: Is it possible to set int=16bit
Mime-Version: 1.0



>The best thing to do is to use the keyword short. You may be able to do a
>minor change.
>
>#define int short
No way!

Where would you define such a thing? Before the system / compiler headers?
Then you change the compiler's own declarations, and if anything still
works it's pure luck!
OTOH, if you #define it after the compiler headers, then your code is in
contradiction with the library code (e.g. how many bytes to push on the
stack, etc.)

IMO the best way is to change your code manually. Something like:

     typedef short           int16;
     typedef unsigned short  uint16;
     // etc.

I always prefer to declare a special typedef when I need nn-bit quantities,
but you can use plain short.
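A minimal sketch of that typedef approach, with a compile-time guard added by me (the names int16/uint16 are illustrative, not mandated by anything):

```c
#include <limits.h>

/* Fixed-width typedefs in the spirit of the post; on djgpp (and most
   other platforms) short is 16 bits, but we verify that assumption. */
typedef short          int16;
typedef unsigned short uint16;

/* Fail the build loudly if short is not actually 16 bits here. */
#if USHRT_MAX != 0xFFFF
#error "short is not 16 bits on this platform -- pick another base type"
#endif
```

Modern C would reach for int16_t / uint16_t from <stdint.h>, but that header was only standardized in C99, after this 1997 mail.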

Eyal.



