
Defer handling of incoming HTTP requests in Node

I am deliberately keeping this at a high level because I suspect I'm missing some fundamental concept here.

I am running a Node/Express server. I want this server to effectively act as a throttle, so I want to store incoming HTTP requests (in memory or in another process, like Redis) and handle them (with response.end()) at a later time. Is this possible?

Is this possible?

No. You can store state/information about the request anywhere you want, but the request itself is fundamentally tied to a file descriptor for the actual open socket handling it, and there's really no way to move that around.

Is this possible?

Yes, but with some limitations.

First off, if you aren't going to change the design of the client making the request, you will HAVE to respond on the very socket that the HTTP connection came in on. An HTTP connection is a live socket connection from client to server, and the response must be sent on that socket. That means that if you're going to wait a while before responding, you will have to keep that socket alive and find a way to respond before the socket hits a timeout and gives up waiting for the response. Browsers all have some sort of client-side timeout set, and if they don't receive a response within that amount of time, they will close the socket and assume the request just wasn't successful. You would have to either respond in less time than this timeout, or coerce the client into lengthening its timeout by dribbling bits and pieces of some sort of response out slowly to make the client think that data is coming (I haven't tried this myself).

FYI, incoming connections cannot be "stored" in another process or on disk. You will have to keep track of them in your node/express server.
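A minimal sketch of that in-process approach, assuming an Express app: the live res objects are queued in memory (they can't be serialized into Redis) and answered later, well before the client's timeout. The /throttled route, the drain interval, and the response body are placeholders, not anything from the original answer.

```js
// Sketch: hold the live response objects in memory and answer them later.
const express = require('express');
const app = express();

const pending = []; // queue of { req, res } still waiting for a response

app.get('/throttled', (req, res) => {
  // Don't respond yet -- just remember the live response object.
  pending.push({ req, res });
});

// Drain the queue on some schedule, well before clients/browsers time out.
setInterval(() => {
  const next = pending.shift();
  if (next) {
    next.res.json({ ok: true, answeredAt: Date.now() });
  }
}, 10 * 1000);

app.listen(3000);
```

If the queue grows faster than it drains, the oldest entries will still hit the client-side timeout, so the drain rate has to keep pace with the timeout mentioned above.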


If you can change the design of the client, then there are all sorts of things you could do. For example, the client could make a request and the response it gets immediately could be a unique ID and a time value. This tells the client that its response has been "deferred" until later. Meanwhile, the server stores the request, and sometime before the scheduled deferred time it can compute that response. Then, when the client checks back in and asks for the response for that particular unique ID, the server can return it.
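One way this "check back later" design could look, as a sketch: the route names, the in-memory jobs map, the 202 status, and the fixed 5-second delay are all assumptions for illustration.

```js
// Sketch: first request returns an ID, the client polls until the result is ready.
const express = require('express');
const crypto = require('crypto');
const app = express();

const jobs = new Map(); // jobId -> { status, result }

app.post('/work', (req, res) => {
  const id = crypto.randomUUID();
  jobs.set(id, { status: 'pending', result: null });

  // Simulate computing the response sometime before the client checks back.
  setTimeout(() => {
    jobs.set(id, { status: 'done', result: { answer: 42 } });
  }, 5000);

  // Tell the client its response has been deferred and when to check back.
  res.status(202).json({ id, retryAfterMs: 5000 });
});

app.get('/work/:id', (req, res) => {
  const job = jobs.get(req.params.id);
  if (!job) return res.status(404).end();
  if (job.status !== 'done') return res.status(202).json({ status: 'pending' });
  res.json(job.result);
});

app.listen(3000);
```

Because each polling request gets an immediate response, nothing here depends on keeping a socket open past its timeout, and the jobs map could just as well live in Redis since it only holds plain data, not sockets.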


If you can use a socket.io (or even a plain webSocket) connection, then you can do this a little more easily. Since the webSocket-based connection is persistent, the client can make the request and then, sometime later whenever the server is ready, the server can send the response over that client's webSocket connection.
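A sketch of the socket.io variant, assuming socket.io v4; the "deferred-request"/"deferred-response" event names and the 30-second delay are made up for the example.

```js
// Sketch: the connection is persistent, so the server can push the answer
// whenever it is ready instead of racing an HTTP timeout.
const http = require('http');
const { Server } = require('socket.io'); // assumes socket.io v4

const httpServer = http.createServer();
const io = new Server(httpServer);

io.on('connection', (socket) => {
  socket.on('deferred-request', (payload) => {
    // Decide later (here: 30 seconds) when to actually answer this client.
    setTimeout(() => {
      socket.emit('deferred-response', { received: payload, answeredAt: Date.now() });
    }, 30 * 1000);
  });
});

httpServer.listen(3000);
```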
