If you compare two values of obviously different types, the compiler normally catches the mismatch. Some type mismatches aren't easy to spot, however, even for humans:
#include <stdio.h>

#define CHARVAL '\xff'

int main(void)
{
    unsigned char uc;

    uc = CHARVAL;
    if (uc == CHARVAL)
        printf("Eureka!");
    else
        printf("Oops...");
    return 0;
}
The program prints Oops..., which probably isn't what you expected. The comparison between CHARVAL and uc is false even though both appear to hold the same character value.
The answer lies in the way the compiler converts signed and unsigned char values to int before it compares them. The #define directive,
#define CHARVAL '\xff'
defines CHARVAL as the character constant '\xff', whose bit pattern is 0xff. Since no sign is specified, the compiler treats the constant as a signed char value by default. When it converts the char to an int, as it does all character values used in expressions, the compiler extends the value's sign. The result is an int with the value 0xffff (that is, -1 on a compiler with a 16-bit int).
The variable uc undergoes the same conversion to int, with one important difference: because uc is explicitly declared unsigned, no sign extension takes place, and its value converts to the int 0x00ff (that is, 255).
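You can see the two conversions directly by printing the int value each operand is promoted to. The following is a minimal sketch, assuming a compiler on which plain char is signed by default (as discussed above); the variable names sc and uc are purely illustrative:

#include <stdio.h>

#define CHARVAL '\xff'

int main(void)
{
    char          sc = CHARVAL;   /* plain char: sign-extended when promoted */
    unsigned char uc = CHARVAL;   /* unsigned char: zero-extended            */

    /* %d prints the int each value is promoted to in a comparison. */
    printf("char promotes to          %d\n", sc);   /* -1 after sign extension  */
    printf("unsigned char promotes to %d\n", uc);   /* 255 after zero extension */
    return 0;
}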
When the two int values are compared, the result is false (0xffff does not equal 0x00ff). One solution is to explicitly cast CHARVAL to the desired type:
#define CHARVAL (unsigned char)'\xff'
Now the compiler compares two unsigned char values, giving the desired result. Another solution is to make CHARVAL an int instead of a char constant:
#define CHARVAL 0xff
Both solutions give the desired result, although the second is slightly less efficient: because CHARVAL is now an int constant, the compiler generates word-size rather than byte-size machine-code instructions for it.
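As a quick check, here is a minimal sketch combining both fixes; the macro names CHARVAL_CAST and CHARVAL_INT are illustrative and not part of the original example. With either constant, the comparison against uc is true and the program prints Eureka!:

#include <stdio.h>

#define CHARVAL_CAST (unsigned char)'\xff'  /* solution 1: cast the constant */
#define CHARVAL_INT  0xff                   /* solution 2: int constant      */

int main(void)
{
    unsigned char uc = '\xff';

    /* uc promotes to 255; the cast constant also promotes to 255,
       and the int constant is already 255, so both tests succeed. */
    if (uc == CHARVAL_CAST && uc == CHARVAL_INT)
        printf("Eureka!");
    else
        printf("Oops...");
    return 0;
}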