
int x; scanf() with %d and printf() with %c

With int x; there will be 2 bytes of memory for the variable (on my platform). Now, if I enter 66, then because scanf() uses %d, 66 will be stored in that 2-byte memory, since the variable is declared int.

Now, printf() with %c should collect data from only one byte of memory to display.

But %c correctly displayed B, getting the correct value 66 from memory.

Why did %c not just get its data from one byte?

%c expects an int argument, due to the default argument promotions for vararg functions. In other words, all of the following are exactly equivalent:

int x = 66;
char y = 66;
printf("%c", x);         // A
printf("%c", (char)x);   // B
printf("%c", y);         // C
printf("%c", (int)y);    // D

So all that's happening is printf is interpreting the int value 66 as an ASCII code¹ and printing the corresponding character.


¹ Note that using ASCII is technically an implementation-defined design decision, just an overwhelmingly common one.

The %c conversion specifier in a printf() statement expects an int argument. Further, since printf() is a variadic function, a char is converted to an int by virtue of the default argument promotions.

The int argument that is passed to printf() corresponding to a %c specifier is then converted to an unsigned char by printf() before printing. Note that the conversion of a signed integer type to an unsigned integer type is well-defined in C, and does not involve collecting "data from only one byte." Rather, if the new type can hold the original value, the value remains unchanged; otherwise one more than the maximum value for the new type is added to (or subtracted from) the old (signed) value. For example, an int value of -1 would be converted to an unsigned char value (assuming UCHAR_MAX is 255) of -1 + 256, or 255, which is within the range of an unsigned char.

Note that an int with a value of 66 would be unchanged in the conversion to unsigned char, since 66 is well within the range of an unsigned char (UCHAR_MAX must be at least 255).

Regardless of how the argument is passed, the %c format specifier always converts its argument to unsigned char before printing it. So, %c always prints one byte.

Your assertion that %c gets its data from more than one byte is unfounded. The example presented does not show any evidence to the contrary: 66 is a number that fits into one byte.

The intricacies of variadic argument passing (yes, it is passed as an int) have no bearing on the observed behavior in this case.
