
Node.js loop crashes shortly after starting when bulk-inserting data into Cassandra

I am trying to insert 1,000,000 records into Cassandra with Node.js, but the loop crashes a little while after starting. I can never insert more than 10,000 records. Why does the loop crash? Can anybody help me?

Thanks.

My code looks like:

var helenus = require('helenus'),
    pool = new helenus.ConnectionPool({
      hosts    : ['localhost:9160'],
      keyspace : 'twissandra',
      user     : '',
      password : '',
      timeout  : 3000
    });

pool.on('error', function(err){
  console.error(err.name, err.message);
});

var i = 0;
pool.connect(function(err, keyspace){
  if (err) {
    throw(err);
  } else {
    while (i < 1000000) {
      i++;
      var str  = "tkg" + i;
      var pass = "ktr" + i;
      pool.cql("INSERT INTO users (username, password) VALUES (?, ?)", [str, pass], function(err, results){
      });
    }
  }
});
console.log("end");

You're probably overloading the Cassandra queue by attempting to make a million requests all at once! Keep in mind that each request is asynchronous, so it is issued even if the previous one has not completed.

Try using async.eachLimit to cap it at 50-100 requests at a time. The actual maximum safe concurrency depends on the backend.
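To illustrate the idea without pulling in the async package, here is a minimal hand-rolled concurrency limiter doing the same thing as async.eachLimit. The setImmediate task below is a stand-in for the real pool.cql call (that substitution is an assumption made so the sketch is self-contained):

```javascript
// Run `total` async tasks, at most `limit` in flight at once.
function runLimited(total, limit, task, done) {
  var inFlight = 0, next = 0, finished = 0;
  function launch() {
    // Start new tasks until we hit the concurrency cap or run out.
    while (inFlight < limit && next < total) {
      inFlight++;
      task(next++, function () {
        inFlight--;
        finished++;
        if (finished === total) return done();
        launch();  // a slot freed up; start the next task
      });
    }
  }
  launch();
}

// Demo: each "insert" is a fake async operation standing in for pool.cql.
var completed = [];
runLimited(10, 3, function (i, cb) {
  setImmediate(function () {
    completed.push("tkg" + i);  // in real code: insert user "tkg" + i
    cb();
  });
}, function () {
  console.log("inserted", completed.length, "records");  // prints "inserted 10 records"
});
```

With the real pool, the task body would call pool.cql(...) and invoke cb from its callback, so at most `limit` inserts are outstanding at any moment instead of a million.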

Actually, there was no problem. I checked the number of records at two different times and saw that the write operation kept going until the timeout value was reached. The timeout value is given inside the code. In summary, there is no crash in the code. Thank you, Julian H. Lam, for the reply.

But another question: how can I increase the write performance of Cassandra? What should I change in the cassandra.yaml file, or elsewhere?
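No answer was posted for this follow-up, but for orientation, these are write-path settings that do exist in cassandra.yaml. The values shown are only illustrative, not tuning recommendations; the right values depend on hardware and workload:

```yaml
# Number of concurrent write requests the node will process;
# often sized relative to the number of CPU cores.
concurrent_writes: 32

# Threads that flush memtables to disk.
memtable_flush_writers: 1

# How the commit log is synced to disk. "periodic" acknowledges
# writes before fsync and syncs every commitlog_sync_period_in_ms.
commitlog_sync: periodic
commitlog_sync_period_in_ms: 10000
```

Client-side batching (e.g. CQL BATCH statements or bounded concurrency as in the answer above) usually matters as much as server-side tuning for bulk loads.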

Thank you.
