I (C++ newbie) am currently trying to implement the following function:
std::string bytes_to_hex(const std::string &bytes);
The function should return the base16 (hex) encoding of a given byte string:
std::string input{0xde, 0xad, 0xbe, 0xef} => "deadbeef"
My first version doesn't quite work the way I expected:
#include <iomanip>
#include <sstream>
#include <string>

std::string bytes_to_hex(const std::string &bytes) {
    std::ostringstream ss;
    ss << std::hex;
    for (auto &c : bytes) {
        ss << std::setfill('0') << std::setw(2) << +c;
    }
    return ss.str();
}
With this function the output is:
ffffffdeffffffadffffffbeffffffef
After some experiments, I've found out that this version looks better:
#include <cstdint>
#include <iomanip>
#include <sstream>
#include <string>

std::string bytes_to_hex(const std::string &bytes) {
    std::ostringstream ss;
    ss << std::hex;
    for (const char &c : bytes) {
        ss << std::setfill('0') << std::setw(2) << +static_cast<uint8_t>(c);
    }
    return ss.str();
}
The output is as expected:
deadbeef
My question is: why does the first version pad each byte with ff, and why does the cast to uint8_t fix it?
As mentioned in my comment, the unary + forces integer promotion. When that happens, signed types are sign extended, which for two's-complement integers means that negative values (where the left-most bit is 1) are left-padded with binary ones (i.e. 0xde becomes 0xffffffde).

Also mentioned is that char can be either signed or unsigned; the choice is up to the compiler. Given the output you got, we can tell that in your case char is signed.

The simple solution you found is to first cast the character to an unsigned char, and then (with the unary +) promote it to int.