
How do Node.js clusters, Redis, the single thread, and non-blocking I/O work together?

I will try to sum up everything I have recently learned about Node.js configuration. If it is correct, please confirm. If I got something wrong, please correct me (and others).

Node.js runs a server. As a server it can receive requests from clients. Let's say there are 200 requests in a given period of time.

Since Node.js is single-threaded, it handles all of those requests with one unit / one core.

However, Node also uses event-driven, non-blocking I/O, which means one client can call a method in the Node.js API that takes 1 minute to complete while another client calls a different method that also takes 1 minute to complete. Both can be handled simultaneously without having to wait for each other.
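To make that concrete, here is a minimal sketch: two "slow" operations are started back to back, and because neither blocks the thread while waiting, both finish in about 100 ms total rather than 200 ms. `setTimeout` stands in for any slow, non-blocking operation (a database query, an HTTP call, a file read).

```javascript
// setTimeout stands in for any slow, non-blocking operation
// (a database query, an HTTP call, a file read).
var results = [];

function slowTask(name, ms) {
  return new Promise(function (resolve) {
    setTimeout(function () {
      results.push(name);
      resolve(name);
    }, ms);
  });
}

var startedAt = Date.now();

// Both tasks are started back to back; the event loop interleaves
// their waits instead of running them one after the other.
Promise.all([slowTask('A', 100), slowTask('B', 100)]).then(function () {
  var elapsed = Date.now() - startedAt;
  console.log('both done in ~' + elapsed + 'ms'); // ~100ms, not ~200ms
});
```

Both waits overlap on the single thread; the thread is only busy for the few microseconds it takes to schedule the timers and run the callbacks.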

Now that sounds awesome, but it can be even better if we assist the Node server and let it work as a proxy, spreading the 200 requests across clusters/workers. The Node server then becomes a proxy between the clients and the workers. Like a project manager :)

The benefit of this is that instead of one core processing 200 requests, we now have, say, 4 cores processing them, and each core uses event-driven, non-blocking I/O (depending on the server's specifications, of course).

That's probably super fast, isn't it?

And here is the problem / what I want to understand:

How do I make those clusters/workers share state?

I mean, if 50 requests go to core 1, 50 requests go to core 2, and so on across all available cores, how can I send a socket emit that reaches clients connected to different cores? (socket.io)

How can clusters run together with a rate limiter? I mean, the rate limiter will ban a user who tries to spam, but then the ban is effectively revoked immediately, because the user's next request lands on a different cluster worker with its own separate counter.

I have heard that Redis is supposed to help with this, but it's very confusing: Redis is a DB, isn't it? I am already working with MongoDB, so why do I need another DB?

I will put some code here; maybe the solution is just around the corner:

var cluster = require('cluster');
var redis = require("socket.io-redis");

if(cluster.isMaster) {

    var numWorkers = require('os').cpus().length;
    console.log('Master cluster setting up ' + numWorkers + ' workers...');

    for(var i = 0; i < numWorkers; i++) {
        cluster.fork();
    }

    cluster.on('online', function(worker) {
        console.log('Worker ' + worker.process.pid + ' is online');
    });

    cluster.on('exit', function(worker, code, signal) {
        console.log('Worker ' + worker.process.pid + ' stopped with code: ' + code + ', and signal: ' + signal);
        console.log('Starting a new worker');
        cluster.fork();
    });

} else {

  // define main required variables
  var express = require('express');
  var app = express();
  var cors = require('cors');
  var server = require('http').createServer(app);
  var io = require('socket.io')(server);
  var mongojs = require('mongojs');
  var db = mongojs("mongodb://user:pass*@ip:port/dbname", []);
  var ObjectId = require('mongodb').ObjectID;
  var CronJob = require('cron').CronJob;
  var request = require('request');
  var fcm = require('./routes/fcm');
  var Excel = require('exceljs');
  var fs = require('fs');
  var path = require('path');
  var rateLimit = require('express-rate-limit');

  // define some global limiter against spammer
  var limiter = rateLimit({
    windowMs: 10*60*1000, // 10 minutes
    max: 100, // limit each IP to 100 requests per windowMs
    delayMs: 0, // disable delaying - full speed until the max limit is reached
    message: "Service is blocked for you, try again later"
  });

  //  apply to all requests
  app.use(limiter);

  // allow cross origin to access my api
  app.use(cors());

  var bodyParser = require('body-parser');
  app.use(bodyParser.json()); // support json encoded bodies
  app.use(bodyParser.urlencoded({ extended: true })); // support encoded bodies
  app.use(express.static(__dirname + '/src'));

  // my own routes
  require('./routes/sockets')(app, io, mongojs, db, ObjectId, CronJob, request, fcm);
  require('./routes/userServer')(app, io, mongojs, db, ObjectId, CronJob, request, fcm, rateLimit);
  require('./routes/ridesServer')(app, io, mongojs, db, ObjectId, CronJob, request);
  require('./routes/offersServer')(app, io, mongojs, db, ObjectId, CronJob, request, rateLimit);
  require('./routes/notificationsServer')(app, io, mongojs, db, ObjectId, CronJob, request);
  require('./routes/scopeServer')(app, io, mongojs, db, ObjectId, CronJob, request, fcm);
  require('./routes/excelServer')(app, io, mongojs, db, ObjectId, CronJob, request, fcm, Excel, fs, path);

  // listen
  server.listen("8080", function() {
    console.log("Connected to db and listening on port 8080");
  });

}

How do I handle these things?

Thanks.

I have found the solution, and here it is:

For socket.io with multiple clusters, do this:

Step 1: Install Redis on your machine (on macOS, via Homebrew): brew install redis

Step 2: Run the Redis server: redis-server (in a separate terminal tab).

Step 3: Install the Redis adapter dependency: npm install socket.io-redis --save (this is the package the code below requires; it pulls in the redis client it needs).

Step 4: Attach the Redis adapter right after the definition of io, like this:

var io = require('socket.io')(server);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));

You are ready to go.
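With the adapter attached, a broadcast issued from any worker reaches clients connected to every worker, because the adapter relays events over Redis pub/sub. A sketch of what that looks like inside each worker (the event names `shout` and `news` are made up for illustration; `io` is the instance from Step 4):

```javascript
// Runs inside each worker, after io.adapter(...) has been set up.
io.on('connection', function(socket) {
  socket.on('shout', function(msg) {
    // Without the Redis adapter, this emit would only reach clients
    // connected to THIS worker's process; with it, the adapter
    // republishes the event over Redis and every worker delivers it
    // to its own connected clients.
    io.emit('news', msg);
  });
});
```

This is exactly the answer to the "socket emit to different cores" question: the emit call does not change, only the adapter behind it does.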

For a rate limiter that works across clusters, use an NPM package that keeps its counters in Redis as well.
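The general shape is to give `express-rate-limit` a store that keeps its hit counts in Redis instead of in each worker's memory, so every worker checks the same counters. A sketch assuming the `rate-limit-redis` package; treat the exact option names as an assumption and check that package's README for the version you install:

```javascript
var rateLimit = require('express-rate-limit');
var RedisStore = require('rate-limit-redis'); // assumed package, see its README

var limiter = rateLimit({
  // Counters live in Redis, shared by all cluster workers, so a
  // banned spammer stays banned no matter which worker answers.
  store: new RedisStore({
    redisURL: 'redis://localhost:6379' // assumed: same local Redis the socket.io adapter uses
  }),
  windowMs: 10 * 60 * 1000, // 10 minutes
  max: 100                  // limit each IP to 100 requests per window
});

app.use(limiter); // `app` is the Express app from the question's worker code
```

This fixes the "ban revoked on a different cluster" problem from the question: the ban decision is based on the shared Redis count, not on a per-process one.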
