
Why doesn't this program work properly for decimal numbers with more than 10 digits?

The code below is used to count the number of digits in a given decimal number. The problem is that it doesn't count numbers with more than 10 digits.

int NumDigits(int n) {
  int digits = 0;
  if (n <= 0) {   // make n non-negative; count one position for the sign (or for the digit 0)
    n = -n;
    ++digits;
  }

  while (n) {     // strip one decimal digit per iteration
    n /= 10;
    ++digits;
  }

  return digits;
}

It seems like your toolchain has a 32-bit int type. The maximum value representable in such a type is 2^31 - 1, or 2,147,483,647. As you can see, that's a 10-digit number. You'll need to use a different type that supports larger numbers if you want to use this kind of algorithm.
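For example, here is a minimal sketch of the same loop with a long long parameter (the name NumDigitsLL and the test value are just illustrative, not from the original code). long long is guaranteed to be at least 64 bits, so it can hold values with up to 19 digits:

#include <stdio.h>

/* Same counting loop as in the question, but with a 64-bit parameter. */
int NumDigitsLL(long long n) {
  int digits = 0;
  if (n <= 0) {   /* count one position for the sign (or for the digit 0) */
    n = -n;
    ++digits;
  }
  while (n) {     /* strip one decimal digit per iteration */
    n /= 10;
    ++digits;
  }
  return digits;
}

int main(void) {
  printf("%d\n", NumDigitsLL(123456789012LL));  /* 12-digit value: prints 12 */
  return 0;
}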

That's a 32-bit integer, which has a maximum value of 2,147,483,647. You might want to look into using 64-bit integers or other solutions.
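One such alternative (a sketch under the assumption that counting the '-' sign as a position is desired, as in the original code; the helper name is just illustrative) is to let snprintf format the value and count the characters it produces:

#include <stdio.h>

/* Returns the number of characters needed to print n, including a leading '-'. */
int NumDigitsViaString(long long n) {
  char buf[32];
  return snprintf(buf, sizeof buf, "%lld", n);
}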

Try long for the argument type.

As Jonathan already stated in his comment, an int cannot contain a number with more than 10 digits.

The reason is that your parameter is of type int, which is limited in size. Most likely you have a 32-bit int (you can check with sizeof(int), which gives you the size in bytes), which goes up to 2,147,483,647 and then overflows to the negative value -2,147,483,648.

Example:

int i = 2147483647;   // maximum value of a 32-bit int (INT_MAX)
i = i + 1;            // overflows past INT_MAX
printf("%d\n", i);

gives you the output: "-2147483648"
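If you want to verify the width and range on your own toolchain, a small check along these lines (using sizeof and the constants from <limits.h>) will print them:

#include <limits.h>
#include <stdio.h>

int main(void) {
  /* sizeof(int) is the size in bytes; INT_MAX and INT_MIN give the exact range. */
  printf("sizeof(int) = %zu bytes\n", sizeof(int));
  printf("INT_MAX     = %d\n", INT_MAX);
  printf("INT_MIN     = %d\n", INT_MIN);
  return 0;
}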
