Changing Default int to 32 Bits

Article ID: Q38730
Last reviewed: July 17, 1997
Keywords: kbprg

The information in this article applies to:

 - Microsoft C for MS-DOS, versions 5.10, 6.00, 6.00a, 6.00ax, and 7.00
 - Microsoft C for OS/2, versions 5.10, 6.00, and 6.00a
 - Microsoft Visual C++ for Windows, versions 1.00 and 1.50

In Microsoft C versions 5.1, 6.0, 6.0a, and 6.0ax, Microsoft C/C++ version 7.0, and Microsoft Visual C++, there is no compiler switch that changes the default int to long rather than short.

Adding a "#define int long" to each module you compile will work around some of the problems. (A typedef cannot be used for this purpose, because "int" is a keyword and cannot be renamed by a typedef.) However, redefining the identifier "int" may cause severe and difficult-to-find problems, and Microsoft emphatically does NOT recommend it.

Note: K&R and ANSI are both very clear that an int may be any size, provided that it is at least 16 bits. Relying on 32-bit ints is bad coding practice because it makes the code difficult to port.

Changing all int variables to long also causes your program to run more slowly: whenever it does arithmetic, it must use slower 32-bit arithmetic rather than the more efficient built-in 16-bit arithmetic. This is true even on 80386 processors, because current versions of Microsoft's compilers do not generate code that takes advantage of the 80386's 32-bit registers.

A better strategy is to change to long only those variables that actually need to be long. This avoids many unintended side effects and prevents unnecessary 32-bit arithmetic.
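
As an illustration of that strategy, here is a minimal sketch (the function and variable names are hypothetical, not from the original article) contrasting the discouraged blanket redefinition with declaring long only where the extra range is actually needed:

   /* Discouraged: redefines every occurrence of "int" in the module,
      including declarations pulled in from headers, and forces 32-bit
      arithmetic everywhere.                                             */
   /* #define int long */

   /* Preferred: use long only where a value can exceed the 16-bit range. */
   long SumValues(const int values[], int count)
   {
       long total = 0L;    /* the sum may exceed 32,767, so it is long  */
       int  i;             /* loop counter stays a fast 16-bit int      */

       for (i = 0; i < count; i++)
           total += values[i];   /* each int value is promoted to long  */

       return total;
   }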
Additional reference words: kbinf 1.00 1.50 5.10 6.00 6.00a 6.00ax 7.00
© 1998 Microsoft Corporation. All rights reserved.