
How do I convert a char array to a uint16_t by casting a pointer?

#include <cstdint>
#include <iostream>

int main() {
    char bytes[2];
    bytes[0] = 0;  // 0x00
    bytes[1] = 24; // 0x18
    uint16_t* ptrU16 = (uint16_t*)bytes; // I expect *ptrU16 to be 0x0018
    std::cout << *ptrU16 << std::endl;   // I expect 24, but it prints 6144
}

What's wrong with my code?

You have a little-endian machine. 6144 is 0x1800. When your machine represents the 16-bit value 0x0018 in memory, it puts the 0x18 byte first and the 0x00 byte second. Likewise, when it interprets your two-byte sequence 0x00, 0x18 as a uint16_t, it takes the first byte as the low byte, so you get 6144 (i.e. 0x1800) and not 24 (i.e. 0x0018).
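To see the byte order for yourself, here is a minimal sketch that prints the individual bytes of a uint16_t (inspecting an object's bytes through unsigned char* is always well-defined):

#include <cstdint>
#include <cstdio>

int main() {
    uint16_t v = 0x0018; // the value 24
    const unsigned char* p = reinterpret_cast<const unsigned char*>(&v);
    // A little-endian machine prints "18 00"; a big-endian machine prints "00 18".
    std::printf("%02x %02x\n", p[0], p[1]);
}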

If you change to:

bytes[0] = 24; 
bytes[1] = 0;

you'll likely see the result you expect.

If you really want to get the result you expect regardless of your machine's byte order, you'll have to compute it manually, such as (assuming the swapped layout above, with the low byte in bytes[0]):

uint16_t n = (bytes[1] << 8) + bytes[0];

or, more generally:

#include <cstddef>
#include <cstdint>
#include <iostream>

int main() {
    char bytes[] = {0x18, 0x00}; // low byte first
    uint16_t n = 0;
    for (std::size_t i = 0; i < 2; ++i) {
        n += bytes[i] << (8 * i); // shift each byte into its position
    }
    std::cout << n << std::endl; // prints 24
}

or you can use a function like ntohs(), since network byte order is big-endian.

You may want to look into the ntohs() function ("network-to-host byte order conversion"). You've stored your data in big-endian order, which is traditionally also network byte order, so no matter what host you're on, ntohs() should return the value you're expecting. There's a mirror function, htons(), for going from host to network order.

#include <arpa/inet.h>
...
cout << ntohs(*ptrU16) << endl; // htons() performs the identical swap here

should work and be portable across systems (i.e. it should work on POWER, ARM, x86, Alpha, etc.).
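For reference, a minimal self-contained sketch of the above (assuming a POSIX system, where ntohs() lives in <arpa/inet.h>; on Windows it is declared in <winsock2.h>):

#include <arpa/inet.h> // ntohs(); POSIX
#include <cstdint>
#include <iostream>

int main() {
    char bytes[2];
    bytes[0] = 0;  // 0x00 -- high byte first, i.e. network (big-endian) order
    bytes[1] = 24; // 0x18
    uint16_t* ptrU16 = (uint16_t*)bytes;
    std::cout << ntohs(*ptrU16) << std::endl; // prints 24 regardless of host byte order
}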
