
java: convert binary string to int

I'm trying to convert a couple of binary strings back to int. However, not all of them convert; some throw a java.lang.NumberFormatException. Here is my test code with 3 binary strings:

public class Bin {

    public static void main(String[] args) {
        String binaryString;
        binaryString = Integer.toBinaryString(~0);
        //binaryString = Integer.toBinaryString(~1);
        //binaryString = "1010";
        int base = 2;
        int decimal = Integer.parseInt(binaryString, base);
        System.out.println("INPUT=" + binaryString + " decimal=" + decimal);
    }
}

If I convert "1010" it works great, but when I try to convert either of the other two I get the exception. Can someone explain why this happens?

Cheers

As explained above, Integer.toBinaryString() treats ~0 and ~1 as unsigned ints, so the resulting binary strings represent values larger than Integer.MAX_VALUE.

You could parse them as long and convert back to int, as below.

int base = 2;
for (int num : new int[] {~0, ~1}) {
    String binaryString = Integer.toBinaryString(num);
    // Parse as long, because the unsigned value does not fit in an int
    long decimal = Long.parseLong(binaryString, base);
    // Casting back to int restores the original negative value
    System.out.println("INPUT=" + binaryString + " decimal=" + (int) decimal);
}
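
If you are on Java 8 or later, Integer.parseUnsignedInt is a more direct alternative to the long-based workaround above; a minimal sketch along the same lines:

int base = 2;
for (int num : new int[] {~0, ~1}) {
    String binaryString = Integer.toBinaryString(num);
    // parseUnsignedInt accepts the full unsigned 32-bit range and wraps it into an int
    int decimal = Integer.parseUnsignedInt(binaryString, base);
    System.out.println("INPUT=" + binaryString + " decimal=" + decimal); // -1, then -2
}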

From http://docs.oracle.com/javase/1.5.0/docs/api/java/lang/Integer.html#toBinaryString(int) : the toBinaryString() method returns the binary representation of its argument as an unsigned integer, where "the unsigned integer value is the argument plus 2^32 if the argument is negative".

From http://docs.oracle.com/javase/1.5.0/docs/api/java/lang/Integer.html#parseInt(java.lang.String,%20int) : the parseInt() method throws NumberFormatException if "the value represented by the string is not a value of type int".

Note that both ~0 and ~1 are negative (-1 and -2 respectively), so they will be converted to the binary representations of 2^32 - 1 and 2^32 - 2 respectively, neither of which can be represented as a value of type int, which causes the NumberFormatException you are seeing.
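
To make the quoted documentation concrete, a small check (a sketch, using Long.parseLong just to inspect the numeric value) shows why parseInt has to fail here:

String s = Integer.toBinaryString(-1);                         // 32 ones
System.out.println(s.length());                                // 32
System.out.println(Long.parseLong(s, 2));                      // 4294967295, i.e. 2^32 - 1
System.out.println(Long.parseLong(s, 2) > Integer.MAX_VALUE);  // true
// Integer.parseInt(s, 2) throws NumberFormatException for the same string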

The bits for "~0" are 11111111111111111111111111111111 (32 1's). Normally, this represents the number -1. The bits for "~1" are 11111111111111111111111111111110 (31 1's followed by a zero). Normally, this represents the number -2.

I tried "01111111111111111111111111111111" (a 0 and 31 1's), which represents the largest signed integer, with parseInt and there was no error. But when I tried "10000000000000000000000000000000", which represents the minimum signed integer, the error appeared again.

The parseInt method seems to expect a "-" in the input to indicate that a negative number is desired. It looks like the method detects overflow of the int value and throws the NumberFormatException.
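
A quick experiment (a sketch of the behaviour described above, using the same bit patterns) confirms this:

// Integer.MAX_VALUE: a 0 followed by 31 ones -- parses fine
System.out.println(Integer.parseInt("01111111111111111111111111111111", 2)); // 2147483647

// Negative numbers must be written as '-' plus the magnitude
System.out.println(Integer.parseInt("-1", 2));                                // -1
System.out.println(Integer.parseInt("-10000000000000000000000000000000", 2)); // -2147483648

// The raw two's-complement form overflows int and throws NumberFormatException:
// Integer.parseInt("10000000000000000000000000000000", 2);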

Adding all four related methods and a comparison here for easier understanding:

public static void binary() {

    // Integer.toString(value, 2): sign plus magnitude in binary for negative values
    System.out.println(Integer.toString(-1, 2)); // -1
    // Integer.parseInt(string, 2): parses a signed binary string, honouring the sign
    System.out.println(Integer.parseInt(Integer.toString(-1, 2), 2)); // -1

    // Integer.toBinaryString(value): unsigned (two's-complement) binary representation
    System.out.println(Integer.toBinaryString(-1)); // 11111111111111111111111111111111
    // Integer.parseUnsignedInt(string, 2): parses the unsigned representation back to an int
    System.out.println(Integer.parseUnsignedInt(Integer.toBinaryString(-1), 2)); // -1
}
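
In short, each formatter has a matching parser: Integer.toString(value, 2) round-trips through Integer.parseInt(string, 2), and Integer.toBinaryString(value) round-trips through Integer.parseUnsignedInt(string, 2); mixing the pairs is what triggers the NumberFormatException for negative values.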
