So basically here is what I have. The user gives an integer and I'm converting it into 3 bytes.
int number = 167123;
byte[] bytes = new byte[3];
bytes[0] = (byte)(number / 65536);
bytes[1] = (byte)(number / 256);
bytes[2] = (byte)number;
stream.Position = 0x503;
stream.WriteByte(bytes[2]);
stream.WriteByte(bytes[1]);
stream.WriteByte(bytes[0]);
(Note: I'm writing the bytes in reverse order on purpose at the end.)
When I check the value later it works as intended. Now I'm looking hard at the code and trying the calculation by hand and I'm not getting the right answer. What am I doing wrong? How is this working? And what is Visual C# writing into the third byte when it casts 167123 as a "byte"?
This works because converting the int value to a byte truncates it to its low 8 bits. That's probably why your manual math isn't working out - you're not truncating.
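To see the truncation concretely, here is a quick sketch using the question's value, 167123 (0x028CD3 in hex):

```csharp
using System;

class TruncationDemo
{
    static void Main()
    {
        int number = 167123;                 // 0x028CD3

        // Converting to byte keeps only the low 8 bits (value modulo 256
        // for non-negative input).
        byte low  = (byte)number;            // 0xD3 = 211
        byte mid  = (byte)(number / 256);    // 652 truncated to 0x8C = 140
        byte high = (byte)(number / 65536);  // 0x02 = 2

        Console.WriteLine($"{high} {mid} {low}");  // 2 140 211
    }
}
```

This also answers the question about the third byte: casting 167123 to byte stores 211, the low 8 bits of the value.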
Essentially, what you're doing by dividing is bitshifting. Your code is equivalent to this:
bytes[0] = (byte)(number >> 16);
bytes[1] = (byte)(number >> 8);
bytes[2] = (byte)number;
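A minimal sketch checking that dividing by powers of two and right-shifting produce the same bytes (using the question's number, though any non-negative int works):

```csharp
using System;

class ShiftVsDivide
{
    static void Main()
    {
        int number = 167123;

        byte[] byDivide =
        {
            (byte)(number / 65536),
            (byte)(number / 256),
            (byte)number,
        };

        byte[] byShift =
        {
            (byte)(number >> 16),
            (byte)(number >> 8),
            (byte)number,
        };

        // For non-negative values the two approaches agree byte for byte.
        for (int i = 0; i < 3; i++)
            Console.WriteLine(byDivide[i] == byShift[i]);  // True True True
    }
}
```

One caveat: this equivalence only holds for non-negative values. For negative ints, `/` rounds toward zero while `>>` is an arithmetic shift that rounds toward negative infinity, so the results can differ.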
To make your manual math work: do the division, convert the result to binary, and chop off everything above the low 8 bits. What remains is the number that ends up in the byte array.
One example:
bytes[1] = (byte)(number / 256);
Integer division gives 167123 / 256 = 652. In binary, 652 is 001010001100. Now truncate everything above the size of a byte (8 bits) and you are left with 10001100, which is 140 in decimal. That is the value stored at this index of the byte array.
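The arithmetic above, checked in code (for non-negative values, truncating to a byte is the same as taking the value modulo 256):

```csharp
using System;

class WorkedExample
{
    static void Main()
    {
        int number = 167123;

        int quotient = number / 256;      // 652, binary 0010 1000 1100
        byte truncated = (byte)quotient;  // keep the low 8 bits: 1000 1100

        Console.WriteLine(quotient);        // 652
        Console.WriteLine(truncated);       // 140
        Console.WriteLine(quotient % 256);  // 140 - same as the truncation
    }
}
```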
Try using BitConverter.GetBytes(int) instead.
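A sketch of the BitConverter approach. On a little-endian machine (check BitConverter.IsLittleEndian), GetBytes returns the least significant byte first, which happens to match the reversed write order in the question. A MemoryStream at position 0 stands in for the question's file stream at 0x503:

```csharp
using System;
using System.IO;

class BitConverterDemo
{
    static void Main()
    {
        int number = 167123;

        // 4 bytes; least significant byte first on little-endian platforms.
        byte[] bytes = BitConverter.GetBytes(number);

        using var stream = new MemoryStream();
        stream.Position = 0;          // the question seeks to 0x503 in a larger file
        stream.Write(bytes, 0, 3);    // write only the low 3 bytes: 211, 140, 2

        Console.WriteLine(string.Join(" ", stream.ToArray()));  // 211 140 2
    }
}
```

Writing the first 3 bytes of the little-endian result reproduces exactly what the original loop wrote by hand.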