
I've created code to convert binary to decimal, but it doesn't work with more than 10 bits

I've written a small program to convert a binary number to a decimal number. When I enter a binary number of up to 10 bits, the result is correct, but as soon as I go beyond 10 bits, the result is wrong.

The algorithm that I used is the following:

 1   1   0   0   1   0
32  16   8   4   2   1  x
-----------------------
32+ 16+  0+  0+  2+  0  = 50

The Code:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    unsigned long binary, i = 0, j = 0, result = 0, base = 1;
    unsigned char *binaryStandalone = malloc(16);
    memset(binaryStandalone, 0, 16);

    printf("Enter a binary number: ");
    scanf("%lu", &binary);

    /* Split the decimal representation into single digits (0 or 1). */
    while (binary > 0) {
        binaryStandalone[i] = binary % 10;
        binary = binary / 10;
        i++;
    }

    /* Digit j carries the weight 2^j. */
    for (j = 0; j < i; j++) {
        result += binaryStandalone[j] << j;
        printf("%lu = %lu\n", j, base << j);
    }
    printf("The decimal number is: %lu\n", result);

    free(binaryStandalone);
    return 0;
}

Now I want to know: what is the reason that the code doesn't give the correct result when the binary number has more than 10 bits?

It seems that your platform uses 32 bits for a long int, so your binary variable can hold at most the value 2^32 - 1 = 4294967295, which is sufficient for 10 digits but not for eleven. For example, the smallest 11-digit input, 10000000000, already exceeds that limit.
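
If you want to verify the limit on your own platform, here is a minimal check (this snippet is illustrative, not from the original post; limits.h provides ULONG_MAX):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Prints 4294967295 on a platform where unsigned long is 32 bits. */
    printf("ULONG_MAX = %lu\n", ULONG_MAX);
    return 0;
}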

You could use unsigned long long instead (64 bits are sufficient for 20 digits), or read the input as a string.
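
For instance, a minimal sketch of the string-based approach (the 64-digit buffer size and the input validation are my assumptions, not part of the original code):

#include <stdio.h>

int main(void)
{
    char buf[65];                      /* up to 64 binary digits plus '\0' */
    unsigned long long result = 0;

    printf("Enter a binary number: ");
    if (scanf("%64s", buf) != 1)
        return 1;

    /* Process the digits left to right: each step shifts the
       accumulated value up by one bit and adds the next digit. */
    for (int i = 0; buf[i] != '\0'; i++) {
        if (buf[i] != '0' && buf[i] != '1')
            break;                     /* stop at the first non-binary digit */
        result = (result << 1) | (unsigned long long)(buf[i] - '0');
    }

    printf("The decimal number is: %llu\n", result);
    return 0;
}

Processing the digits left to right this way never parses the input as a decimal integer, so the only limit is the width of the result type.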

You are storing the number in an unsigned long, whose range is 0 to 4,294,967,295 -> only 10 digits.

Because the long value you are using to store the "binary" input cannot hold more decimal digits than that. You might want to read the input as a string instead.
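
A minimal sketch of that suggestion, using the standard C99 strtoull to parse the string directly as base 2 (the buffer size is an assumption):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char buf[65];

    printf("Enter a binary number: ");
    if (scanf("%64s", buf) != 1)
        return 1;

    /* strtoull with base 2 interprets the string as binary digits
       and stops at the first character that is not 0 or 1. */
    unsigned long long result = strtoull(buf, NULL, 2);
    printf("The decimal number is: %llu\n", result);
    return 0;
}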
