Objects of signed integral types can be converted to corresponding unsigned types. When these conversions occur, the actual bit pattern does not change; however, the interpretation of the data changes. Consider this code:
#include <iostream>

int main()
{
    short i = -3;
    unsigned short u;
    std::cout << (u = i) << "\n";
    return 0;
}
On a platform with a 16-bit short, the following output results:
65533
In the preceding example, a signed short, i, is defined and initialized to a negative number. The expression (u = i) causes i to be converted to an unsigned short prior to the assignment to u. The conversion is performed modulo 2^16: the value -3 wraps around to 65536 - 3 = 65533.