
JavaScript is giving a different answer to the same algorithm in Python

I'm working on the Rosalind problem Mortal Fibonacci Rabbits, and the website keeps telling me my answer is wrong when I use my algorithm written in JavaScript. When I use the same algorithm in Python I get a different (and correct) answer.

The inconsistency only happens when the result gets large. For example, fibd(90, 19) returns 2870048561233730600 in JavaScript, but in Python I get 2870048561233731259.

Is there something about numbers in JavaScript that gives me a different answer, or am I making a subtle mistake in my JavaScript code?

The JavaScript solution:

function fibd(n, m) {
    // Intended to create an array of m zeroes. (Note: new Array(m) makes
    // an array of holes, and map skips holes, so this map never runs --
    // but reduce also skips holes, so the sums below still come out right.)
    var rp = new Array(m);
    rp = rp.map(function(e) { return 0; });
    rp[0] = 1;

    for (var i = 1; i < n; i++) {
        // prepend the sum of all elements from 1 to the end of the array
        rp.splice(0, 0, rp.reduce(function (e, s) { return s + e; }) - rp[0]);
        // Remove the final element
        rp.pop();
    }

    // Sum up all the elements
    return rp.reduce(function (e, s) { return s + e; });
}

The Python solution:

def fibd(n, m):
    # Create an array of length m and set all elements to 0
    rp = [0] * m
    rp[0] = 1

    for i in range(n-1):
        # The sum of all elements from 1 the end and dropping the final element
        rp = [sum(rp[1:])] + rp[:-1]

    return sum(rp)

I think JavaScript only has a "Number" data type, and this is actually an IEEE 754 double under the hood. 2,870,048,561,233,730,600 is too large to hold precisely in a double, so it is approximated. (Notice the trailing "00" -- about 17 significant decimal digits is the limit for a double.)
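You can see this directly in any JavaScript console. A small demo using the two values from the question (both 19-digit literals round to the same IEEE 754 double, so JavaScript cannot tell them apart):

```javascript
// A double has a 53-bit mantissa, good for roughly 15-17 significant
// decimal digits. These two different 19-digit integers round to the
// same double, so the comparison is true:
console.log(2870048561233731259 === 2870048561233731000); // true

// Printing the literal back gives the rounded value, not the original:
console.log(String(2870048561233731259)); // "2870048561233731000"
```

This is why the long summation in fibd drifts: each intermediate sum past 2^53 is silently rounded to the nearest representable double.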

Python, on the other hand, has bignum support, and will quite cheerfully deal with 4096-bit integers (for those who play around with cryptographic algorithms, this is a huge boon).

You will be able to find a JavaScript bignum library if you search -- for example http://silentmatt.com/biginteger/
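These days the language itself ships one: BigInt (ES2020, so it postdates this answer) gives exact arbitrary-precision integers. A sketch of the question's algorithm ported to BigInt -- the only changes are the n-suffixed literals and an explicit 0n seed for reduce:

```javascript
function fibd(n, m) {
    // rp[i] = number of rabbit pairs that are i months old, as BigInts
    var rp = new Array(m).fill(0n);
    rp[0] = 1n;

    for (var i = 1; i < n; i++) {
        // Prepend the newborns: every pair at least one month old breeds
        rp.unshift(rp.reduce(function (s, e) { return s + e; }, 0n) - rp[0]);
        // The oldest cohort dies
        rp.pop();
    }

    // Total surviving pairs, exact to the last digit
    return rp.reduce(function (s, e) { return s + e; }, 0n);
}

console.log(fibd(90, 19).toString()); // "2870048561233731259" -- matches Python
```

Note that BigInt and Number don't mix in arithmetic, so every literal in the computation has to carry the n suffix.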

Just doing a bit of research, this article seems interesting. JavaScript only supports 53-bit integers: it can represent integers exactly only up to 2^53 − 1.

The result given by Python is indeed out of the maximum safe integer range for JS. If you try to do

parseInt('2870048561233731259')

it will indeed return

2870048561233731000
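The cutoff is exposed in the language as Number.MAX_SAFE_INTEGER (2^53 − 1), and Number.isSafeInteger reports whether a value is still exact. A short illustration with the numbers from this question:

```javascript
// The largest integer a Number can represent with no gaps: 2^53 - 1
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991

// Python's exact answer is far past that, so as a Number it has
// already been rounded before any check can run:
console.log(Number.isSafeInteger(2870048561233731259)); // false
console.log(parseInt('2870048561233731259'));           // 2870048561233731000
```

So the JavaScript fibd isn't wrong algorithmically; its running totals simply outgrow the precision of a double.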
