What happens when you cast a character that isn't on the ASCII chart to type int in C++?

Does it have its own value? Will certain characters crash the program?

Assume sizeof(char) < sizeof(int). For the examples below, let's also assume a 32-bit int and a two's complement architecture.

Casting a signed char or an unsigned char to a signed int preserves the value: the char is simply promoted to int.

Casting an unsigned char to an unsigned int likewise preserves the value, promoting it to unsigned int.

Casting a signed char to an unsigned int behaves slightly differently, depending on how you think of the conversion. The signed char is first promoted to signed int, which includes being sign-extended to retain any negative value; the result is then converted to unsigned.

The latter case has an interesting side effect when casting and promoting negative values.

signed char c1 = 0x7f;  // 0x7f stored in a signed char is 127
unsigned int n1 = c1;   // n1 gets 0x0000007f (127)

signed char c2 = 0xff;  // 0xff stored in a signed char is -1 (two's complement)
unsigned int n2 = c2;   // n2 gets 0xffffffff (4294967295)
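Putting the three cases together, here's a minimal, self-contained sketch (variable names are my own) that demonstrates the conversions under the stated 32-bit, two's complement assumptions:

#include <cstdio>

int main() {
    signed char   sc = 0xff;  // stored as -1 on a two's complement machine
    unsigned char uc = 0xff;  // stored as 255

    int          si = sc;  // sign-extended: -1
    int          ui = uc;  // zero-extended: 255
    unsigned int un = sc;  // promoted to -1 first, then wraps to 0xffffffff

    std::printf("signed char   -> int:          %d\n", si);
    std::printf("unsigned char -> int:          %d\n", ui);
    std::printf("signed char   -> unsigned int: %u (0x%x)\n", un, un);
}

On such a platform this prints -1, 255, and 4294967295 (0xffffffff). No character value can crash the program; every one simply maps to an integer.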

The reason I explicitly write "signed char" and "unsigned char" above is that this declaration:

char c;

could be signed or unsigned; the signedness of plain char is implementation-defined. Most compilers default to signed but provide an option to treat it as unsigned (e.g., -funsigned-char in GCC and Clang, /J in MSVC).

I've had to fix a few bugs because code assumed the wrong signedness for char.
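If code depends on plain char having a particular signedness, one way to catch a wrong assumption at build time rather than debug it later is a compile-time check; a small sketch using the standard library:

#include <limits>

// Fails the build on any target where plain char is unsigned.
static_assert(std::numeric_limits<char>::is_signed,
              "this code assumes plain char is signed");

Flip the condition (or use std::is_signed<char> from <type_traits>) if the code assumes unsigned char behavior instead.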
