
Managing connections on a command-based TCP socket API in node.js

I built a RESTful API based on express.js which communicates with a remote server over a TCP socket using JSON. Requested URLs are converted into the appropriate JSON messages, a new TCP socket is opened, and the message is sent. When a message arrives on that same connection, an event is fired, the JSON reply is evaluated, and a new JSON message is returned as the result of the GET request.

Possible paths:

  1. Async (currently in use) - Open a new connection to the server for each request.
  2. Sync - Queue all the requests and wait for each response; blocking code.
  3. Track - Send all the requests at once and receive the answers asynchronously. Use a tracker id on each request to match it with its answer.
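The tracker approach (option 3) boils down to keeping a map of outstanding ids. A minimal sketch, assuming an `id` field is acceptable to the remote protocol (the field name and helper names here are illustrative, not from the original):

```javascript
// Correlate requests with responses over one shared socket using a tracker id.
let nextId = 1;
const pending = new Map(); // id -> callback waiting for the matching reply

// Tag a payload with a fresh id and remember who is waiting for the answer.
// Returns the message object to write to the shared socket.
function track(payload, callback) {
  const id = nextId++;
  pending.set(id, callback);
  return Object.assign({ id }, payload);
}

// Route an incoming reply to the callback that requested it.
// Returns false for unknown (or already-answered) ids.
function dispatch(reply) {
  const callback = pending.get(reply.id);
  if (!callback) return false;
  pending.delete(reply.id);
  callback(null, reply);
  return true;
}
```

Every reply handler for the socket then just calls `dispatch()` on each parsed message; requests and responses no longer need to arrive in order.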

What would be the best direction to take? Is there a common pattern for solving this kind of application?

Option 1 (async, a new connection for each request) is probably the easiest to implement.

If you want to reuse the socket for efficiency, you should come up with your own "keep-alive" mechanism - essentially streaming multiple requests and answers over the same socket.

I'd probably use a double CRLF ('\r\n\r\n') as the delimiter between JSON requests, fire a 'request' event for each one, and simply write back the answer asynchronously. Delimiter-less streaming is possible, but it requires extra parsing when you receive a partial JSON string from the socket.
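The delimiter scheme above can be sketched as a small framing function: buffer incoming chunks and emit one parsed JSON object per double-CRLF-terminated frame (the function names here are illustrative):

```javascript
const DELIMITER = '\r\n\r\n';

// Returns a feed function to attach to a socket's 'data' events.
// Partial frames stay buffered until the delimiter arrives.
function makeFramer(onMessage) {
  let buffer = '';
  return function feed(chunk) {
    buffer += chunk;
    let index;
    while ((index = buffer.indexOf(DELIMITER)) !== -1) {
      const frame = buffer.slice(0, index);
      buffer = buffer.slice(index + DELIMITER.length);
      if (frame.length > 0) onMessage(JSON.parse(frame));
    }
  };
}
```

Wiring it up would look like `socket.setEncoding('utf8'); socket.on('data', makeFramer(handleRequest));`, with `handleRequest` writing its answer back whenever it is ready.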
