
Not able to understand JavaScript behaviour

I am trying to execute the following code:

var n = 100;
var sum = 0;
while (n > 0) {
    sum = sum + n % 10;
    n = n / 10;
}
console.log(sum);

The result should be 1, but JavaScript is returning

1.1111111111111112

Also, when I run the individual statements on their own, they give perfectly fine results. What could be the possible reason?

The result it is giving is correct -

Let's step through the while loop.

In the first iteration, n = 100, so n % 10 = 0; after the first iteration, sum = 0 and n = 10.

In the second iteration, n % 10 = 0 again; after the second iteration, sum = 0 and n = 1.

In the third iteration, n % 10 = 1, so sum becomes 1; after the third iteration, n = 0.1.

In the fourth iteration, n % 10 = 0.1, so sum becomes 1.1; after the fourth iteration, n = 0.01.

And so on.

Each iteration adds an ever smaller digit, so the sum 1 + 0.1 + 0.01 + ... keeps creeping towards 1.1111..., and with floating-point rounding the value that finally gets printed is 1.1111111111111112.
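
To see these steps for yourself, here is a small instrumented version of the loop (just the original code plus a console.log and an iteration cap for the demo; a sketch, not part of the question):

var n = 100;
var sum = 0;
var steps = 0;
while (n > 0 && steps < 6) {          // cap at 6 iterations just for the demo
    console.log('n =', n, ' n % 10 =', n % 10, ' sum so far =', sum);
    sum = sum + n % 10;               // % works on fractions in JavaScript
    n = n / 10;                       // 100 -> 10 -> 1 -> 0.1 -> 0.01 -> ...
    steps = steps + 1;
}
console.log(sum);                     // roughly 1.111 after these 6 iterations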

Hope this helps

EDIT

If you want the final answer to be 1, initialise n as 0.9 (n = 0.9).
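
Alternatively, if the goal was to sum the decimal digits of n (so that 100 gives 1), one common approach, shown here only as a sketch and not something from the original answer, is to truncate the division with Math.floor so that n always stays an integer:

var n = 100;
var sum = 0;
while (n > 0) {
    sum = sum + n % 10;       // add the last decimal digit
    n = Math.floor(n / 10);   // drop the last digit; n stays an integer
}
console.log(sum);             // 1 for n = 100, 6 for n = 123, etc.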

No, the result won't be 1. Notice that your loop divides n by 10 on every iteration, and that it only stops when the condition n > 0 becomes false, which takes many iterations to happen:

  • n=100, n%10=0
  • n=10, n%10=0
  • n=1, n%10=1
  • n=0.1, n%10=0.1
  • n=0.01, n%10=0.01
  • ...

So, the result is correct: 1.1111111... The 2 at the end is nothing more than a tiny floating-point rounding error.
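
That value is no accident: the loop effectively adds the series 1 + 0.1 + 0.01 + ..., whose limit is 10/9, and the IEEE 754 double closest to 10/9 prints as 1.1111111111111112 in JavaScript. The loop also genuinely terminates, because repeated division by 10 eventually underflows to 0. A small sketch to illustrate both points (assuming a standard JavaScript engine):

console.log(10 / 9);          // 1.1111111111111112 (nearest double to the series limit)

var n = 100;
var iterations = 0;
while (n > 0) {
    n = n / 10;               // eventually underflows to 0
    iterations = iterations + 1;
}
console.log(iterations);      // a few hundred iterations before n finally reaches 0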
