
Scaling socket.io between servers

What are the approaches for scaling socket.io applications? I see the following problem, which I don't understand how to solve:

  • How can a scaled socket.io app broadcast to a room? In other words, how will one socket.io instance know about room members connected to other servers?

It is hard for me to imagine how this should work -- perhaps a shared store holding all the necessary information, such as Redis -- is this a possibility?

EDIT: I found this article: http://www.ranu.com.ar/2011/11/redisstore-and-rooms-with-socketio.html

Based on it I did the following:

    var pub = redis.createClient();
    var sub = redis.createClient();
    var store = redis.createClient();
    pub.auth("pass");
    sub.auth("pass");
    store.auth("pass");

    io.configure(function () {
      io.enable('browser client minification');  // send minified client
      io.enable('browser client etag');          // apply etag caching logic based on version number
      io.enable('browser client gzip');          // gzip the file
      io.set('log level', 1);                    // reduce logging
      io.set('transports', [                     // enable all transports (optional if you want flashsocket)
          'websocket'
        , 'flashsocket'
        , 'htmlfile'
        , 'xhr-polling'
        , 'jsonp-polling'
      ]);
      var RedisStore = require('socket.io/lib/stores/redis');
      io.set('store', new RedisStore({redisPub: pub, redisSub: sub, redisClient: store}));
    });

But I get the following error:

      Error: Uncaught, unspecified 'error' event.
     at RedisClient.emit (events.js:50:15)
     at Command.callback (/home/qwe/chat/io2/node_modules/socket.io/node_modules/redis/index.js:232:29)
     at RedisClient.return_error (/home/qwe/chat/io2/node_modules/socket.io/node_modules/redis/index.js:382:25)
     at RedisReplyParser.<anonymous> (/home/qwe/chat/io2/node_modules/socket.io/node_modules/redis/index.js:78:14)
     at RedisReplyParser.emit (events.js:67:17)
     at RedisReplyParser.send_error (/home/qwe/chat/io2/node_modules/socket.io/node_modules/redis/lib/parser/javascript.js:265:14)
     at RedisReplyParser.execute (/home/qwe/chat/io2/node_modules/socket.io/node_modules/redis/lib/parser/javascript.js:124:22)
     at RedisClient.on_data (/home/qwe/chat/io2/node_modules/socket.io/node_modules/redis/index.js:358:27)
     at Socket.<anonymous> (/home/qwe/chat/io2/node_modules/socket.io/node_modules/redis/index.js:93:14)
     at Socket.emit (events.js:67:17)

My Redis credentials are definitely correct.

EDIT: Very strange, but with Redis authorization disabled everything works, so the question is still valid. Additionally, how can I get information (for example, the user name) about all the participants of a group (room) in this RedisStore mode -- is it possible to implement this? Ideally it could be done through the Redis Pub/Sub functionality.
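My current idea (just a sketch, not tested; the key scheme and helper names are invented) is to track room membership myself in a Redis hash next to the store, so any server can look up the user names for a room:

    // store "socket id -> user name" per room in a Redis hash,
    // using the same "store" client created above
    function joinRoom(socket, room, userName) {
      socket.join(room);
      store.hset('room:' + room + ':members', socket.id, userName);
    }

    function leaveRoom(socket, room) {
      socket.leave(room);
      store.hdel('room:' + room + ':members', socket.id);
    }

    // any instance can fetch the full member list, regardless of
    // which server each participant is connected to
    function getRoomMembers(room, callback) {
      store.hgetall('room:' + room + ':members', callback);
    }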

You can use socket.io-cluster for this: https://github.com/muchmala/socket.io-cluster

Try adding this code in:

pub.on('error', function (err) {
  console.error('pub', err.stack);
});
sub.on('error', function (err) {
  console.error('sub', err.stack);
});
store.on('error', function (err) {
  console.error('store', err.stack);
});

It won't fix it, but it should at least give you a more useful error.

I suggest not using RedisStore. It has a CPU-usage problem because of its poor use of pub/sub, which makes it unscalable (it can handle less load than a single pure node.js instance running socket.io, which makes it fairly useless). I personally used Redis as a data store to keep the room list and implemented my own room functions (Redis is an in-memory key-value database, but it has a persistence mechanism). When you want room data, you just fetch it from that same Redis, and that's it.

However, to run socket.io as multiple instances you also need a load balancer such as HAProxy or Nginx to spread the work across multiple node.js ports; otherwise your users will still use only one node.js process. This is a lot of work. If you also have a web frontend in another language, that is even more work, because some networks block every port except 80 and 443. You can read more about these things at:

http://book.mixu.net/node/ch13.html
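As a rough sketch of what I mean (illustrative only -- the channel and key names are made up, and `io` is assumed to be your socket.io server): keep room membership in Redis yourself and relay broadcasts through a plain Redis pub/sub channel, so each node.js process delivers the message to its own local sockets:

var redis = require('redis');
var pub = redis.createClient();
var sub = redis.createClient();
var store = redis.createClient();

// every instance listens on one channel and re-emits to its local sockets
sub.subscribe('room-broadcast');
sub.on('message', function (channel, payload) {
  var data = JSON.parse(payload);
  io.sockets.in(data.room).emit('message', data.message);
});

// room membership lives in Redis so any instance can read it
function addToRoom(room, socket) {
  socket.join(room);
  store.sadd('room:' + room, socket.id);
}

// broadcasting from any instance goes through Redis,
// so sockets connected to the other instances receive it too
function broadcastToRoom(room, message) {
  pub.publish('room-broadcast', JSON.stringify({ room: room, message: message }));
}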

Another possible solution is to use an alternative like PubNub to scale real-time interaction. I came across a similar problem when developing Mote.io and decided to go with a hosted solution instead of building a load balancer. I now work for PubNub.

PubNub will take care of the data-sync problem you're talking about. Normally you would need to sync Redis across servers or load-balance your clients to the same instance to make sure they all get the same messages. PubNub abstracts this away so you don't have to worry about it.

Real-time Chat Apps in 10 Lines of Code


Enter Chat and press enter
<div><input id=input placeholder=you-chat-here /></div>

Chat Output
<div id=box></div>

<script src=http://cdn.pubnub.com/pubnub.min.js></script>
<script>(function(){
var box = PUBNUB.$('box'), input = PUBNUB.$('input'), channel = 'chat';
PUBNUB.subscribe({
    channel  : channel,
    callback : function(text) { box.innerHTML = (''+text).replace( /[<>]/g, '' ) + '<br>' + box.innerHTML }
});
PUBNUB.bind( 'keyup', input, function(e) {
    (e.keyCode || e.charCode) === 13 && PUBNUB.publish({
        channel : channel, message : input.value, x : (input.value='')
    })
} )
})()</script>

Using RabbitMQ

I achieved socket.io application scaling using RabbitMQ. In my current setup, I run two replicas of a socket.io application container in Docker Swarm and communicate with them. Here's a demonstration, with the container ID shown with each message: [screenshot]

How to

RabbitMQ is a message broker; basically, it syncs all the instances of the application backend. Each backend instance pushes its messages to a RabbitMQ queue, which is consumed by all the other instances. The RabbitMQ handler in Node.js is given below.

// "os" is needed for os.hostname() below; rabbitMQHandler is the author's own
// RabbitMQ wrapper module (see the linked repository at the end of this answer).
var os = require('os');

function rabbitHandler(io){
  rabbitMQHandler('amqp://test_rabbit', function(err, options){

    if(err){
      throw err;  
    }

    options.onMessageReceived = onMessageReceived;

    io.on('connection', websocketConnect);

    function websocketConnect(socket){

      console.log('New connection')
      io.emit('start', {ipHost: os.hostname()})

      socket.on('disconnect', socketDisconnect);
      socket.on('message', socketMessage);

      function socketDisconnect(e){
        console.log('Disconnect ', e);
      }

      function socketMessage(text){
        var message =  {text: text, date: new Date(), ip: os.hostname()};
  //      io.emit('message', message) // Instead of emitting the message on socket, it is being pushed on rabbitMQ queue.
        options.emitMessage(message);
      }
    }

    function onMessageReceived(message){

      io.emit('message', message)
    }

  });
} 
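The rabbitMQHandler wrapper itself is not shown above. As a hypothetical sketch (not the actual code from the repository below), it could be built on the amqplib package with a fanout exchange, so that every backend instance receives every published message; the exchange name here is made up:

var amqp = require('amqplib/callback_api');

function rabbitMQHandler(url, done) {
  amqp.connect(url, function (err, connection) {
    if (err) return done(err);

    connection.createChannel(function (err, channel) {
      if (err) return done(err);

      var exchange = 'socketio_messages';   // illustrative exchange name
      channel.assertExchange(exchange, 'fanout', { durable: false });

      // each instance gets its own exclusive queue bound to the fanout exchange
      channel.assertQueue('', { exclusive: true }, function (err, q) {
        if (err) return done(err);
        channel.bindQueue(q.queue, exchange, '');

        var options = {
          // publish so that all instances (including this one) receive the message
          emitMessage: function (message) {
            channel.publish(exchange, '', Buffer.from(JSON.stringify(message)));
          },
          // replaced by the caller; invoked for every consumed message
          onMessageReceived: function () {}
        };

        channel.consume(q.queue, function (msg) {
          options.onMessageReceived(JSON.parse(msg.content.toString()));
        }, { noAck: true });

        done(null, options);
      });
    });
  });
}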

There's no change in the socket client whatsoever.

The whole project, with the Docker image and docker-compose files, is available at the following link. You can try it out there.

https://github.com/saqibahmed515/chat-scaling
