From: "A. Sinan Unur" <asu1 AT cornell DOT edu>
Newsgroups: comp.os.msdos.djgpp
Subject: Re: Data types
Date: Tue, 20 Jan 1998 21:28:40 -0500
Organization: Cornell University (http://www.cornell.edu/)
Lines: 45
Sender: asu1 AT cornell DOT edu (Verified)
Message-ID: <34C55D58.DCE814CD@cornell.edu>
References: <34C53551 DOT D297940B AT jet DOT es> <199801210107 DOT UAA22018 AT p2 DOT acadia DOT net>
NNTP-Posting-Host: cu-dialup-0071.cit.cornell.edu
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
To: djgpp AT delorie DOT com
DJ-Gateway: from newsgroup comp.os.msdos.djgpp
Precedence: bulk

Scott Warner wrote:
>
> something like
>
> #include <stdio.h>
>
> int main(void)
> {
>     int i;
>     char c;
>     float f;
>
>     printf("Size of integer: %d\n", sizeof(i));
>     printf("Size of char: %d\n", sizeof(c));
>     printf("Size of float: %d\n", sizeof(f));
>
>     return 0;
> }
>

you don't need to declare any variables: sizeof(type) is a constant
expression of type size_t. incidentally, did you try to compile the
above with -Wall?

printf("Size of integer: %lu\n", sizeof(int));
printf("Size of char: %lu\n", sizeof(char));
printf("Size of float: %lu\n", sizeof(float));

etc. is fine.

> (and so on) is the general idea. I believe sizeof(int) is equivalent
> to sizeof(i) in the above. WORD and DWORD (and the like) are most
> likely preprocessor macros that you can find if you look carefully
> through the header files.

people (e.g. Ralf Brown's interrupt list) generally use WORD to mean
16 bits and DWORD to mean 32 bits, contradicting the convention that a
machine word is that computer's natural int size.

-- 
----------------------------------------------------------------------
A. Sinan Unur
Department of Policy Analysis and Management, College of Human Ecology,
Cornell University, Ithaca, NY 14853, USA

mailto:sinan DOT unur AT cornell DOT edu
http://www.people.cornell.edu/pages/asu1/