Could someone explain to me why the following code:
#include <iostream>
#include <bitset>

int main()
{
    unsigned char i = 2;
    std::cout << std::bitset<8>((~static_cast<unsigned char>(0)) << i) << std::endl;
    std::cout << std::bitset<8>((~static_cast<unsigned char>(0)) >> i) << std::endl;
    return 0;
}
produces:
11111100
11111111
and not:
11111100
00111111
Before `~` is applied, `static_cast<unsigned char>(0)` is converted to `int` (integer promotion), so after `~` the result is an all-one-bits `int` (i.e. the value -1). That `int` is then shifted, and `bitset<8>` truncates the result to its low 8 bits.
Right-shifting a signed value whose most significant bit is 1 was implementation-defined before C++20; on virtually all platforms (and, since C++20, by definition) it is an arithmetic shift, which fills the vacated bits with copies of the sign bit. So the all-one-bits `int` stays all ones after `>> i`, and the low 8 bits are `11111111`.
Right-shifting an unsigned value always zero-fills.