Mail Archives: djgpp/1999/05/13/06:05:24
Pasi Franti writes:
>> Ok. thanx. it is like here then:
>>
>> typedef unsigned short U16;
>> typedef unsigned long U32;
>> typedef unsigned char BYTE;
>
> I disagree.
>
> I did not follow your discussion, but how did you come to such a
> conclusion? You can never be sure how many bits the int and
> long types are without checking!
Which is exactly why you need to make these defines. The above code
is right for djgpp, along with most other 32-bit compilers. If you
want to port your code to some different platform, you just change
those few defines, rather than having to alter every reference to
the types throughout your sources. This is the only practical way
to do it, since you can't use sizeof in a preprocessor test, and
you certainly can't choose between different types at runtime!
Personally, though, I've never much liked this method of defining
your own types. As long as you make some minimal assumptions (eg.
that an int can hold at least 32 bits, or at least 16 bits if you
want to support 16-bit platforms as well), and don't rely on any
specific wrapping behaviour, I've never found a case where I really
needed this kind of define. IMHO it is almost always better to let
the compiler choose a good size for you: if you naively ported a
16-bit DOS program to djgpp by defining all the integers as shorts,
you'd end up with very inefficient code because of all the size
prefixes, whereas if you just said "int" you would get whatever is
the optimal integer datatype for the current machine.
Shawn Hargreaves.