Mail Archives: djgpp/1999/05/13/09:50:13
>>> typedef unsigned short U16;
>>> typedef unsigned long U32;
>>> typedef unsigned char BYTE;
> I understood that the above is for DJGPP, in which case it is
> correct. As I said elsewhere in this thread, for any other
> environment, you indeed need to find out how many bits each type
> uses.
I just wanted to know the reasoning behind this solution. And yes,
it is probably true for DJGPP as shown above. But in general, we aim
at portability beyond DJGPP. No special need currently, but just
in case.
The second point I had for Eugene is that we already solved his
original problem long ago, when he proposed that the above solution
be added to our library. Our solution, as you might have guessed,
uses any type that is not EXACTLY 2 bytes but AT LEAST 2 bytes,
and then uses routines such as:
WriteIntegerToFile(file,value,bytes)
ReadIntegerFromFile(file,value,bytes)
These read/write the integer byte by byte, using the number
of bytes given in the last parameter.
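A minimal sketch of what such routines might look like (the function names
come from the post; the least-significant-byte-first order, the unsigned long
value type, and the error handling are my assumptions, not the library's
actual implementation):

```c
#include <stdio.h>

/* Write `value` to `file` as exactly `bytes` bytes, least
   significant byte first, so the on-disk format does not depend
   on the host's integer size or endianness.
   Returns 0 on success, -1 on write error. */
int WriteIntegerToFile(FILE *file, unsigned long value, int bytes)
{
    int i;
    for (i = 0; i < bytes; i++) {
        if (putc((int)(value & 0xFFUL), file) == EOF)
            return -1;
        value >>= 8;
    }
    return 0;
}

/* Read an integer of exactly `bytes` bytes back from `file`,
   assembling it in the same least-significant-byte-first order.
   Returns 0 on success, -1 on read error or premature EOF. */
int ReadIntegerFromFile(FILE *file, unsigned long *value, int bytes)
{
    int i, c;
    *value = 0;
    for (i = 0; i < bytes; i++) {
        if ((c = getc(file)) == EOF)
            return -1;
        *value |= (unsigned long)c << (8 * i);
    }
    return 0;
}
```

Since each byte is extracted with shifts and masks rather than by copying
the in-memory representation, the same file can be written on one machine
and read on another regardless of word size.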
Another solution for making system-independent definition
for U32, for example, would be:
#include <limits.h>
#if (INT_MAX == 2147483647)
typedef int int32;
#elif (LONG_MAX == 2147483647)
typedef long int32;
#elif (SHRT_MAX == 2147483647)
typedef short int32;
#else
#error "No 32-bit type!"
#endif
Not very nice, but it should work in most cases.
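For completeness, the same <limits.h> trick could be applied to the unsigned
U32 the message started from (a sketch under the same assumption, namely that
one of the standard unsigned types is exactly 32 bits wide; 4294967295 is
2^32 - 1):

```c
#include <limits.h>

/* Pick whichever standard unsigned type holds exactly 32 bits. */
#if (UINT_MAX == 4294967295U)
typedef unsigned int U32;
#elif (ULONG_MAX == 4294967295U)
typedef unsigned long U32;
#elif (USHRT_MAX == 4294967295U)
typedef unsigned short U32;
#else
#error "No unsigned 32-bit type!"
#endif
```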