
How to read a 16-bit number from a file in C#?

I have a requirement that involves some bit operations, and I have no experience with bit operations.

From the file bytes I have to pick the fifth and sixth byte.

(image: the example file bytes shown in hex)

and I have to calculate a value from a formula, which is explained like this for the given example:

byte 5 - LSB, byte 6 MSB. In example offset will be 598 (LSB * (MSB << 8))

How did they get 598 for the given example?

I know LSB means least significant bit and MSB is most significant bit.

So is 8 the LSB of the fifth byte and 0 the MSB of the sixth byte?

Or do I have to convert 98 and 05 to bits and then work out these two values?

I want to code this formula in C#.

You aren't going to need to deal with individual bits. These are hexadecimal values, so 98 is a single byte and 05 is a single byte.

It also looks like there is an error in the conversion: you need to add the LSB to the shifted MSB, not multiply. 0x598 = 0x98 + (0x05 << 8). The << 8 operator shifts your MSB over by one byte (the same as multiplying it by 256).

The byte at index position 5 is: 98 (hex)
The byte at index position 6 is: 05 (hex)

In which case the result is 0x0598 (hex).

If you do the math in C#, you'll likely see the result as an integer, which is 1432.
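
As a minimal sketch of that arithmetic (reusing the example byte array from the code further down; in practice the bytes would come from the file), the corrected formula can be written directly in C#:

    byte[] data = { 0xA9, 0x43, 0x50, 0x49, 0x00, 0x98, 0x05, 0x28 };

    byte lsb = data[5];                // 0x98
    byte msb = data[6];                // 0x05

    int offset = lsb + (msb << 8);     // 0x98 + 0x0500 = 0x0598

    Console.WriteLine(offset);                  // 1432
    Console.WriteLine(offset.ToString("X4"));   // 0598

The bitwise-OR form lsb | (msb << 8) gives the same result, since the low and high bytes don't overlap.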

MSB also stands for most significant byte, which is the meaning used here.

    var data = new byte[] { 0xA9, 0x43, 0x50, 0x49, 0x00, 0x98, 0x05, 0x28 };

    // Combines data[5] and data[6] (0x98 and 0x05) into an unsigned 16-bit value
    // using the machine's native byte order (little-endian on most CPUs).
    ushort value = BitConverter.ToUInt16(data, 5);

    Console.WriteLine(value.ToString("X4")); // prints "0598" on a little-endian machine

BitConverter.ToUInt16 will read two bytes from the data array starting at index 5, i.e. 0x98 and 0x05. It then applies the formula to calculate the result, which is 0x0598.

The formula BitConverter applies depends on the CPU architecture's endianness; see the BitConverter.ToUInt16 documentation.
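
If you need the read to be explicitly little-endian regardless of the machine it runs on, a minimal sketch (assuming .NET Core 2.1+/.NET Standard 2.1, where System.Buffers.Binary.BinaryPrimitives is available, and the same example array) would be:

    using System;
    using System.Buffers.Binary;

    byte[] data = { 0xA9, 0x43, 0x50, 0x49, 0x00, 0x98, 0x05, 0x28 };

    // Always treats data[5] as the LSB and data[6] as the MSB,
    // independent of the CPU's native byte order.
    ushort value = BinaryPrimitives.ReadUInt16LittleEndian(data.AsSpan(5, 2));

    Console.WriteLine(value.ToString("X4")); // 0598

If BinaryPrimitives is not available, you can also check BitConverter.IsLittleEndian and swap the two bytes yourself when it is false.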
