
JavaScript floating-point number addition

Please find below an example code snippet illustrating the issue:

        // Repeatedly add 0.323 and log the running total.
        var x = 0.323;
        var cumulativeVal = 0;

        for (var i = 0; i < 30; i++) {
            cumulativeVal = cumulativeVal + x;
            console.log(cumulativeVal);
        }

The result of the above computation is

  0.323
  0.646
  0.9690000000000001
  1.292
  1.615....
  4.845000000000001
  5.168000000000001
  5.491000000000001
  5.814000000000002....
  9.690000000000007

Note that extra decimal digits are appearing in the results. I understand this has something to do with the precision of numbers in JavaScript, but can anyone explain?

There's nothing particular to explain. IEEE-754 double-precision numbers are not completely, perfectly precise in decimal terms. Small errors can creep in. For full decimal precision (which, you should note, cannot perfectly represent one-third) you'd need to use a type designed for that. (JavaScript doesn't have one built in; examples from other languages would be BigDecimal from Java, or decimal from C#.)
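One common workaround, sketched here under the assumption that every value in play fits in three decimal places, is to do the arithmetic in scaled integers and divide only at the end; integer additions within the safe-integer range are exact:

```javascript
// Accumulate 0.323 thirty times using scaled-integer arithmetic.
// Assumption: three decimal places suffice, so we scale by 1000.
var SCALE = 1000;
var xScaled = 323;          // 0.323 * 1000, written directly as an exact integer
var cumulativeScaled = 0;

for (var i = 0; i < 30; i++) {
    cumulativeScaled += xScaled;    // exact: integer addition, no rounding
}

var cumulativeVal = cumulativeScaled / SCALE;
console.log(cumulativeVal);         // 9.69, not 9.690000000000007
```

The final division can still round, but you get the double nearest to the true decimal result instead of thirty accumulated rounding errors.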

There's an easier example, by the way:

0.1 + 0.2 = 0.30000000000000004

It's one of Crockford's favorites.
