
Sum primes via Sieve of Eratosthenes in JavaScript: fails only if the argument is prime

My implementation of the sieve itself seems to be working fine, and the summing function returns the correct result as long as the last value is not itself a prime number. Oddly, I can see that the primality is duly noted in the true/false array if I return the array directly, but I can't seem to actually get at it for the purposes of summing. As a result, running this sieve on 10 returns 17 (correct), but running it on 37 returns 160 instead of 197. Running it on 5 returns 5 instead of 10, and so forth.

function sumPrimes(n) {
  var primArr = [];
  var primSum = 0;

  // mark candidates 2..n-1 as potentially prime
  for (var i = 2; i < n; i++) {
    primArr[i] = true;
  }

  // sieve: cross off multiples of each prime
  for (i = 2; i * i < n; i++) {
    if (primArr[i]){
      for (var j = 0; i * i + i * j < n; j++) {
        primArr[i * i + i * j] = false;
      }
    }
  }

  // sum the values still marked prime
  for (i = 2; i <= n; i++) {
    if (primArr[i]) {
      primSum += i;
    }
  }

  return primSum;
}

In all your for loops, use the condition i <= n, since you want to consider n itself as well. Your initialization loop stops at i < n, so primArr[n] is never set to true, which is exactly why the result is wrong whenever n itself is prime.
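For instance, the initialization loop becomes:

for (var i = 2; i <= n; i++) { // was i < n, which left primArr[n] undefined
  primArr[i] = true;
}

and the sieve conditions change to i * i <= n and i * i + i * j <= n in the same way.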

Note that you save some calculations if you change the middle part to this:

for (var i = 2, sqrtN = Math.sqrt(n); i <= sqrtN; i++) {
    if (primArr[i]){
        for (var j = i * i; j <= n; j += i) {
            primArr[j] = false;
        }
    }
}
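
Putting both fixes together, a complete version might look like this (a sketch; variable names follow the original code):

function sumPrimes(n) {
  var primArr = [];
  var primSum = 0;

  // mark every candidate from 2 through n (inclusive) as potentially prime
  for (var i = 2; i <= n; i++) {
    primArr[i] = true;
  }

  // sieve: cross off multiples of each prime up to sqrt(n)
  var sqrtN = Math.sqrt(n);
  for (i = 2; i <= sqrtN; i++) {
    if (primArr[i]) {
      for (var j = i * i; j <= n; j += i) {
        primArr[j] = false;
      }
    }
  }

  // sum everything still marked prime, including n itself
  for (i = 2; i <= n; i++) {
    if (primArr[i]) {
      primSum += i;
    }
  }

  return primSum;
}

sumPrimes(10); // 17
sumPrimes(37); // 197 (previously 160)
sumPrimes(5);  // 10  (previously 5)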
