Java binary multiplication using integer arrays not working

I'm making a program that accepts two decimal numbers and converts them into binary numbers, which are stored in integer arrays. Then I need to multiply the two numbers using the two integer arrays. The result should also be a binary integer array (I need to validate that with a for loop). Finally, I convert the result back to a decimal number.

So far, I have the following code. My logic for converting a decimal number to binary works fine, and vice versa. However, the binary result is always somehow smaller than the expected result. I have spent a lot of time on this; could you help me figure out what is wrong?

import java.util.Scanner;

public class BinaryMultiplication {
public static void main(String[] args) {

    Scanner scanner = new Scanner(System.in);

    int num1 = scanner.nextInt();
    int num2 = scanner.nextInt();

    int[] binaryNum1 = toBinary(num1);
    int[] binaryNum2 = toBinary(num2);
    System.out.println("Expected result: " + num1 * num2);
    System.out.println("Decimal number 1: " + toDecimal(binaryNum1));
    System.out.println("Decimal number 2: " + toDecimal(binaryNum2));

    int[] resultBinaries = new int[100];

    for (int i = 0; i < resultBinaries.length; ++i) {
        resultBinaries[i] = 0;
    }

    for (int i = 0; binaryNum1[i] != -1; ++i) {
        for (int j = 0; binaryNum2[j] != -1; ++j) {
            resultBinaries[i + j] += binaryNum1[i] * binaryNum2[j] % 2;
            resultBinaries[i + j] %= 2;
        }
    }
    resultBinaries[99] = -1;

    for (int i = 0; resultBinaries[i] != -1; ++i) {
        if (resultBinaries[i] > 1) {
            System.out.println("The result is not a binary!!");
        }
    }

    System.out.println("Actual decimal result: " + toDecimal(resultBinaries));
}

public static int toDecimal(int[] binaryNum) {
    int result = 0;
    int factor = 1;
    for (int i = 0; binaryNum[i] != -1; ++i) {
        result += binaryNum[i] * factor;
        factor *= 2;
    }
    return result;
}

public static int[] toBinary(int num) {
    int[] binaries = new int[100];

    int index = 0;

    while (num > 0) {
        binaries[index++] = num % 2;
        num /= 2;
    }

    binaries[index] = -1;

    return binaries;
}
}

A sample input & output (the binary validation loop reports no errors):

45 67
Expected result: 3015
Decimal number 1: 45
Decimal number 2: 67
Actual decimal result: 2871
The problem is in this nested loop:

for (int i = 0; binaryNum1[i] != -1; ++i) {
    for (int j = 0; binaryNum2[j] != -1; ++j) {
        resultBinaries[i + j] += binaryNum1[i] * binaryNum2[j] % 2;
        resultBinaries[i + j] %= 2;
    }
}

What happens when resultBinaries[i + j] increases to 2? It is reduced to 0, and resultBinaries[i + j + 1] should then be incremented by 1, but as far as I can see that isn't happening anywhere in the code. (Note also that the % 2 on the product is redundant, since both digits are already 0 or 1.)
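
A minimal sketch of one way to fix it (my suggestion, not code from the question): accumulate the raw partial products first, then propagate the carries in a separate pass, before the resultBinaries[99] = -1; terminator is written.

for (int i = 0; binaryNum1[i] != -1; ++i) {
    for (int j = 0; binaryNum2[j] != -1; ++j) {
        // Accumulate the raw partial product; a digit may temporarily
        // grow beyond 1 here.
        resultBinaries[i + j] += binaryNum1[i] * binaryNum2[j];
    }
}
// Carry pass: each position keeps its remainder mod 2 and pushes the
// carry into the next position. Stopping at 98 leaves index 99 free
// for the -1 terminator.
for (int i = 0; i < 98; ++i) {
    resultBinaries[i + 1] += resultBinaries[i] / 2;
    resultBinaries[i] %= 2;
}

With the sample input 45 67, this version should print 3015, matching the expected result. You could also handle the carry inline inside the inner loop, but a separate pass keeps the change to your code small.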
