
Delphi Double to Objective-C double

I have been looking for a few hours for a solution to this problem, but I don't get how it works. I have a hex string of a Delphi double value: 0x3FF0000000000000. That value should be 1.0. It is 8 bytes long; the first bit is the sign, the next 11 bits are the exponent, and the rest is the mantissa. So to me this hex value equals 0 x 10^(1023). Maybe I am wrong somewhere, but it doesn't matter. The point is, I need to convert this hex value into an Objective-C double. If I do (double)strtoll(hexString.UTF8String, NULL, 16); I get 4.607... x 10^18. What am I doing wrong?
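(For reference, a Delphi Double is the same IEEE 754 binary64 format as a C/Objective-C double: the stored exponent is biased by 1023 and the mantissa carries an implicit leading 1, so 0x3FF0000000000000 decodes as (-1)^0 x 1.0 x 2^(1023-1023) = 1.0, not 0 x 10^1023. Below is a minimal C sketch of that decoding, assuming a normal, non-denormal value:)

#include <stdint.h>
#include <stdio.h>
#include <math.h>

int main(void) {
    uint64_t bits = 0x3FF0000000000000ULL;          // raw bits of the Delphi double
    int sign      = (int)(bits >> 63);              // 1 sign bit
    int exponent  = (int)((bits >> 52) & 0x7FF);    // 11 exponent bits, bias 1023
    uint64_t man  = bits & 0xFFFFFFFFFFFFFULL;      // 52 mantissa bits

    // value = (-1)^sign * (1 + mantissa / 2^52) * 2^(exponent - 1023)
    double value = (sign ? -1.0 : 1.0)
                 * (1.0 + (double)man / 4503599627370496.0)   // 4503599627370496 = 2^52
                 * pow(2.0, exponent - 1023);

    printf("sign=%d exponent=%d mantissa=%llu value=%f\n",
           sign, exponent, (unsigned long long)man, value);   // 0, 1023, 0, 1.000000
    return 0;
}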

It seems that a cast like this ends up invoking an implicit type conversion (a call to _ultod3 or _ltod3) that alters the underlying data. In fact, even this seems to do the same thing:

UINT64 temp1 = strtoull(hexString, NULL, 16);
double val = *&temp1;   // *& cancels out: this still converts the integer's value, not its bits

But casting the UINT64 pointer to a double* seems to suppress the compiler's desire to perform a conversion. Something like this should work:

UINT64 temp1 = strtoull(hexString, NULL, 16);
double val = *(double*)&temp1;   // reinterpret the same 8 bytes as a double; no value conversion

At least this works with the MS C++ compiler... I imagine the Objective-C compiler would cooperate as well.
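(A caveat: *(double*)&temp1 technically violates C's strict-aliasing rule, even though common compilers accept it here. Copying the bytes with memcpy expresses the same reinterpretation in a well-defined way, and optimizers reduce it to a single move. A minimal Objective-C sketch, assuming hexString is an NSString as in the question:)

#import <Foundation/Foundation.h>
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    @autoreleasepool {
        NSString *hexString = @"3FF0000000000000";   // example input; in practice this comes from Delphi
        uint64_t bits = strtoull(hexString.UTF8String, NULL, 16); // parse the hex digits to raw bits
        double val;
        memcpy(&val, &bits, sizeof val);             // reinterpret the 8 bytes; no value conversion
        NSLog(@"%f", val);                           // logs 1.000000
    }
    return 0;
}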
