
Cast a 3-byte array to int32

Quick question: I want to convert a 3-byte signed two's complement array into a little-endian int32.
I tried setting the size of the memcpy to 3 and adding 1 to the destination address (without forgetting to zero-initialize my int32).
Here is my code:

#include <stdio.h>
#include <string.h>

int main(void)
{
  // little-endian bytes: n1 is already padded to 4 bytes, n2 is the raw 3 bytes
  __uint8_t n1[]={0,0xFF,0xDE,0x81};
  __uint8_t n2[]={0xFF,0xDE,0x81};

  // temporary variables, zero-initialized
  __int32_t tmp1=0;
  __int32_t tmp2=0;

  // copy the raw bytes into the int32 variables
  memcpy(&tmp1,n1,4);
  memcpy((&tmp2)+1,n2,3);

  // print the results
  printf("n1 : %d\n",tmp1);
  printf("n2 : %d\n",tmp2);

  return 0;
}

The output I get:

n1 : -2122195201
n2 : 0

The output I want:

n1 : -2122195201
n2 : -2122195201

How do I fix this? Or would it be better to use a union?

By doing (&tmp2)+1 you take the address of tmp2, which is a pointer to int32_t, and increment that pointer, so you add 4 (sizeof(int32_t)) to the address. This means that the destination address of your memcpy is the address of tmp2 plus 4. You never actually modify tmp2; instead you overwrite some other variable or memory.

It's not a clean way to convert data, but if you really want to use memcpy, do something like (not tested): memcpy((void *)(((char *)&tmp2)+1),n2,3);
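
A minimal, self-contained sketch of that fix (not from the original answer), assuming a little-endian host; it mirrors the n2/tmp2 variables from the question, using the <stdint.h> type names:

#include <stdio.h>
#include <string.h>
#include <stdint.h>

int main(void)
{
    uint8_t n2[] = {0xFF, 0xDE, 0x81};
    int32_t tmp2 = 0;

    // Cast to char * first so that +1 advances by one byte instead of sizeof(int32_t);
    // the low byte of tmp2 stays zero and the three payload bytes land in bytes 1..3.
    memcpy((char *)&tmp2 + 1, n2, 3);

    // Same byte layout (and therefore same printed value) as copying the 4-byte n1 into tmp1.
    printf("n2 : %d\n", tmp2);
    return 0;
}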

I'd avoid casting the value, and instead make the conversion explicit (this will probably be inlined anyway).

I am assuming that your character array is little-endian, too (otherwise swap the indexes).


// Convert a little-endian 3-byte two's complement value to a sign-extended int.
static int three2four(__uint8_t three[3])
{
    int val;

    // Assemble the 24-bit value from the three little-endian bytes.
    val = ((unsigned)three[2] << 16) | ((unsigned)three[1] << 8) | (unsigned)three[0];

    // If the sign bit of the 24-bit value is set, extend it into the top byte.
    if (three[2] & 0x80) val |= (0xffu << 24);

    return val;
}
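
As a usage sketch (not part of the original answer), applying it to the 3-byte array from the question:

    __uint8_t n2[] = {0xFF, 0xDE, 0x81};
    printf("n2 : %d\n", three2four(n2));   // prints -8265985, i.e. 0x81DEFF sign-extended to 32 bits

Note that this interprets the three bytes as a 24-bit two's complement number and sign-extends it into the whole int32, rather than placing them in the upper three bytes as the question's memcpy does.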
