
“Split” one request to multiple servers

Is it possible to split a request to multiple endpoint servers?

For example: if server A receives a request, it then forwards it to 3 different servers (B, C and D), each of which receives the original request with an X-Forwarded-For header.

              forward
request => A ========> B  
             ========> C
             ========> D

The response, for that matter, can be a simple 200 OK.

I have played around with HAProxy and Apache's mod_proxy so far, with no success.

I have also read about HTTP pipelining, but I don't think it applies here.

Gor was written to do exactly this.

Hat tip to Arthur Lutz for posting the answer here.
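
For this use case, Gor (now maintained as GoReplay) can capture traffic on one port and copy it to several HTTP outputs. If I recall the flags correctly (please check the GoReplay docs for your version), an invocation along the lines of `gor --input-raw :80 --output-http "http://B" --output-http "http://C" --output-http "http://D"` replicates each incoming request to all of the listed backends; the hostnames B, C and D here are just placeholders for your real servers.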

Hm. How would I do that? Most networking setups, whether they are proxying requests or load balancing, retain a one-to-one relationship between the incoming request and the final destination for that specific request. I think you would need some code in between. I would probably use mod_perl or a custom Apache module in C to intercept the request at an early phase and do the multiple routing with sub-requests.

Here is a reverse proxy / load balancer that I wrote for a similar purpose in Node.js (it is just for fun, not production-ready at the moment).

https://github.com/losnir/ampel

It is very opinionated, and currently supports:

  • GET: round-robin selection (1:1)
  • POST: request splitting. There is no concept of "master" and "shadow" -- the first backend that responds is the one that serves the client request, and all other responses are discarded (see the sketch below).

If someone finds it useful then I can improve it to be more flexible.
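
To illustrate the request-splitting behaviour described above, here is a minimal, hypothetical sketch of the same idea in TypeScript on top of Node's built-in http module. This is not ampel's actual code; the backend addresses and the listening port 8080 are made up. Each incoming request is fanned out to all backends with an X-Forwarded-For header, the first response is returned to the client, and the rest are discarded.

import * as http from "http";

// Hypothetical backends (B, C and D in the question).
const backends = [
  { host: "127.0.0.1", port: 3001 },
  { host: "127.0.0.1", port: 3002 },
  { host: "127.0.0.1", port: 3003 },
];

const server = http.createServer((clientReq, clientRes) => {
  // Buffer the incoming body so the same payload can be replayed to every backend.
  const chunks: Buffer[] = [];
  clientReq.on("data", (chunk) => chunks.push(chunk));
  clientReq.on("end", () => {
    const body = Buffer.concat(chunks);
    let answered = false;

    for (const backend of backends) {
      const proxyReq = http.request(
        {
          host: backend.host,
          port: backend.port,
          method: clientReq.method,
          path: clientReq.url,
          headers: {
            ...clientReq.headers,
            // Tell each backend who the original client was.
            "x-forwarded-for": clientReq.socket.remoteAddress ?? "",
          },
        },
        (proxyRes) => {
          // First backend to respond wins; later responses are drained and discarded.
          if (answered) {
            proxyRes.resume();
            return;
          }
          answered = true;
          clientRes.writeHead(proxyRes.statusCode ?? 502, proxyRes.headers);
          proxyRes.pipe(clientRes);
        }
      );
      proxyReq.on("error", () => {
        // A failed backend is simply ignored in this sketch.
      });
      proxyReq.end(body);
    }
  });
});

server.listen(8080);

Buffering the whole body keeps the sketch simple and lets the same payload be sent to each backend, at the cost of memory for large uploads; a production version would need streaming, timeouts and proper error handling.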
