
What happens when you cast an int to an int*?

int val{ 100 };
int* ptr1 = (int*)val;
int* ptr2 = ptr1 + 5;
std::cout << ptr2 << '\n' << (int)ptr2 << std::endl;

In this code example, the result of (int*)val is 00000064, but I don't understand why. I also don't understand why (int)ptr2 is 120.

Analyzing line by line:

int* ptr1 = (int*)val;

This assigns the decimal value 100 to the pointer ptr1;

int* ptr2 = ptr1 + 5;

This instruction invokes undefined behaviour: arithmetic on a pointer is only defined when the pointer points into an array object, and the result must stay within that array (or one past its end). A well-defined version of the same arithmetic is sketched below.
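
For contrast, here is a minimal sketch of the case where pointer arithmetic is well defined. The array arr and the variable names are mine, not from the original snippet:

#include <iostream>

int main() {
    int arr[10]{};                      // an actual array: arithmetic inside it is well defined
    int* p = arr;                       // points to arr[0]
    int* q = p + 5;                     // fine: points to arr[5], still inside the array
    std::cout << (q - p) << std::endl;  // prints 5 (counted in elements, not bytes)
}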

std::cout << ptr2 << '\n' << (int)ptr2 << std::endl;

Because of the previous undefined behaviour, this instruction may print anything.

The result of (int*)val is 00000064 because that is the representation of the decimal value 100 in hexadecimal notation.
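
You can verify the hexadecimal representation yourself; this one-liner (mine, not part of the question) confirms that decimal 100 is hex 64:

#include <iostream>

int main() {
    std::cout << std::hex << 100 << std::endl;  // prints 64
}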

Step by step:

int* ptr1 = (int*)val;

After executing this instruction, ptr1 holds the decimal value 100 (00000064 in hexadecimal).

int* ptr2 = ptr1 + 5;

Now ptr2 holds the memory address of ptr1 shifted by 5 units. Pointer arithmetic scales by the size of the pointed-to type, so with sizeof(int) == 4 the shift is 5 * 4 bytes = 20 bytes. The memory address held by ptr2 is therefore 00000078, which is why (int)ptr2 is 120 (100 + 20).
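
The 20-byte step can be observed directly by round-tripping through an integer type. This is a minimal sketch, assuming a platform where sizeof(int) == 4 and where pointer/integer conversion preserves the numeric address (both implementation-defined); strictly, the p + 5 repeats the undefined pointer arithmetic, but it shows the 5 * sizeof(int) step that typical implementations perform:

#include <cstdint>
#include <iostream>

int main() {
    std::uintptr_t base = 100;                        // stands in for the original val
    int* p = reinterpret_cast<int*>(base);            // like (int*)val: p holds address 0x64
    int* q = p + 5;                                   // strictly UB, but typically adds 5 * sizeof(int) = 20 bytes
    std::cout << reinterpret_cast<std::uintptr_t>(q)  // prints 120 on such a platform
              << std::endl;
}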

Does this reasoning make sense?
