
nodejs http response.write: is it possible out-of-memory?

If I have the following code that sends data to the client repeatedly every 10 ms:

setInterval(function() {
    res.write(somedata);
}, 10);    // every 10 ms

What would happen if the client is very slow to receive the data?

Will the server get an out-of-memory error?

Edit: actually the connection is kept alive; the server sends JPEG data endlessly (HTTP multipart/x-mixed-replace: header + body + header + body ...). Because node.js response.write is asynchronous, some users guess that it stores the data in an internal buffer and waits until the lower layer signals that it can send, so the internal buffer will keep growing. Am I right?

If I am right, then how do I resolve this? The problem is that node.js does not notify me when the data has actually been sent for a single write call.

In other words, I cannot assure users that this approach theoretically has no risk of "out of memory", and I do not know how to fix it.
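
For concreteness, here is a minimal sketch of the setup described above (the boundary name and getNextJpegFrame() are hypothetical placeholders, not my real code):

var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'multipart/x-mixed-replace; boundary=frame'
  });

  setInterval(function () {
    var jpeg = getNextJpegFrame();   // hypothetical: returns a Buffer containing one JPEG frame
    res.write('--frame\r\nContent-Type: image/jpeg\r\n\r\n');
    res.write(jpeg);                 // keeps queueing frames even if the client reads slowly
    res.write('\r\n');
  }, 10);
}).listen(8080);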


Following the "drain" event keyword given by user568109, I studied the node.js source and came to the conclusion below:

http.js:

OutgoingMessage.prototype._buffer = function(data, encoding) {
  this.output.push(data); // <-- no size check here; the output buffer can grow without bound ("out-of-memory")
  this.outputEncodings.push(encoding);

  return false;
};


OutgoingMessage.prototype._writeRaw = function(data, encoding) { //this will be called by response.write
  if (data.length === 0) {
    return true;
  }

  if (this.connection &&
      this.connection._httpMessage === this &&
      this.connection.writable &&
      !this.connection.destroyed) {
    // There might be pending data in the this.output buffer.
    while (this.output.length) {
      if (!this.connection.writable) {    //when not ready to send
        this._buffer(data, encoding);    //----------> save data into internal buffer
        return false;
      }
      var c = this.output.shift();
      var e = this.outputEncodings.shift();
      this.connection.write(c, e);
    }

    // Directly write to socket.
    return this.connection.write(data, encoding);
  } else if (this.connection && this.connection.destroyed) {
    // The socket was destroyed.  If we're still trying to write to it,
    // then we haven't gotten the 'close' event yet.
    return false;
  } else {
    // buffer, as long as we're not destroyed.
    this._buffer(data, encoding);
    return false;
  }
};
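
Note that the false returned by _buffer / _writeRaw above is the same value response.write hands back to user code, so the only signal that a chunk has merely been queued is that return value. A sketch of observing it (somedata as in the question; res.connection is the underlying socket in this version of node):

setInterval(function() {
    var flushed = res.write(somedata);
    if (!flushed) {
        // The chunk was only queued in memory; as long as the client stays slow,
        // this queue keeps growing with every tick.
        console.log('buffered, socket queue is now', res.connection.bufferSize, 'bytes');
    }
}, 10);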

Some gotchas:

  1. If sending over HTTP, this may not be a good idea. The browser may consider the request timed out if it is not finished within a specified amount of time, and the server will likewise close a connection that stays idle for too long (see the sketch after this list). If the client cannot keep up, a timeout is almost certain.

  2. setInterval with 10 ms is also subject to some restrictions. It does not mean the callback repeats exactly every 10 ms; 10 ms is only the minimum wait before repeating, so it will run slower than the interval you set.

  3. Say you do manage to overload the response with data: at some point the server will end the connection and respond with 413 Request Entity Too Large, depending on what the limit is set to.

  4. Node.js has a single-threaded architecture with a maximum memory limit of around 1.7 GB. If you set the above server limits too high and have many incoming connections, the process will run out of memory.

So with appropriate limits it will either time out or the request will be too large. (Assuming there are no other errors in your program.)
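
For gotcha 1, the idle timeout on the Node side can be adjusted explicitly instead of being left at the default (a sketch only; the 30-second value and the handler name are just examples):

var http = require('http');

var server = http.createServer(handler);   // handler: the streaming handler from the question

// The default server timeout in this era of node was 2 minutes.
// With a callback supplied, timed-out sockets are handed to you instead of being destroyed automatically.
server.setTimeout(30 * 1000, function (socket) {
  socket.destroy();   // drop clients that cannot keep up instead of buffering for them forever
});

server.listen(8080);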

Update

You need to use the drain event. The HTTP response is a writable stream with its own internal buffer; when that buffer is emptied, the drain event is emitted. You should learn more about streams as you go deeper, as this will help you beyond http alone. You can find several resources about streams on the web.
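
A minimal sketch of that pattern, replacing the fixed setInterval with the write return value plus drain (getNextJpegFrame is again a hypothetical frame source):

function sendFrames(res) {
  function sendOne() {
    var ok = res.write(getNextJpegFrame());
    if (ok) {
      // The data was flushed to the socket (or fits in its buffer); keep going.
      setTimeout(sendOne, 10);
    } else {
      // The internal buffer is backed up; wait for 'drain' before writing again,
      // so memory stays bounded no matter how slow the client is.
      res.once('drain', function () {
        setTimeout(sendOne, 10);
      });
    }
  }
  sendOne();
}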
