
RabbitMQ: multiple queues/one (long) task at a time

I'm using RabbitMQ to manage multiple servers executing long-running tasks. Each server can listen to one or more queues, but each server should process only one task at a time.

Each time I start a consumer on a server, I configure it with channel.basic_qos(prefetch_count=1) so that only one task at a time is processed for the respective queue.
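Each consumer is set up roughly like this (a simplified pika sketch; the queue names and the handler body are illustrative, the real work is elided):

    import pika

    # One server subscribing, with prefetch 1, to the queues it can handle.
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.basic_qos(prefetch_count=1)  # limit unacknowledged messages per consumer

    def handle_task(ch, method, properties, body):
        # ... long-running work ...
        ch.basic_ack(delivery_tag=method.delivery_tag)

    for queue in ("task1", "task2"):
        channel.queue_declare(queue=queue, durable=True)
        channel.basic_consume(queue=queue, on_message_callback=handle_task)

    channel.start_consuming()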

Suppose we have:

  • 2 queues: task1, task2
  • 2 servers: server1, server2
  • Both servers work with task1 and task2

If the following messages are produced at the same time:

  • messageA for task1
  • messageB for task2
  • messageC for task1

What I expect:

  • messageA gets processed by server1
  • messageB gets processed by server2
  • messageC stays queued until one of the servers is ready (finishes its current task)

What I actually get:

  • messageA gets processed by server1
  • messageB gets processed by server2
  • messageC gets processed by server2 (WRONG)

I do not start the consumers at the same time. In fact, consumers are constantly being started and stopped on each server, and most of the time the servers work with different sets of queues (server1: task1, task2, task3; server2: task1, task5; server3: task2, task5; and so on).

How could I manage to do this?

EDIT (based on Olivier's answer): The tasks are different. Each server is able to handle some of the tasks, not all of them, and a server can process only one task at a time.

I tried using exchanges with routing_keys, but I ran into two problems: every server bound to the routing key task_i would process its messages (I need each message to be processed only once), and if no server is bound to task_i, its messages are dropped (I need them to remain queued until some server can handle them).
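For reference, a sketch of this kind of routing-key setup, reconstructed from the description (the declarations are assumptions and all names are illustrative), that exhibits both symptoms:

    import pika

    # Assumed reconstruction: each server declares its OWN queue and binds it to the
    # task types it can handle.
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.exchange_declare(exchange="tasks", exchange_type="direct", durable=True)

    # Every per-server queue bound to "task1" receives its own copy of a "task1"
    # message, so each bound server ends up processing it.
    result = channel.queue_declare(queue="", exclusive=True)
    my_queue = result.method.queue
    for key in ("task1", "task2"):
        channel.queue_bind(exchange="tasks", queue=my_queue, routing_key=key)

    # A message published with a routing key that matches no binding is unroutable
    # and silently dropped by the exchange.
    channel.basic_publish(exchange="tasks", routing_key="task5", body=b"work")

A direct exchange delivers a copy to every bound queue whose key matches, and drops messages whose key matches no binding, which accounts for both problems. Binding one shared, durable queue per routing key (instead of one queue per server) would avoid both symptoms, though the prefetch buffering described in the answer below would still apply when a server consumes from several of those queues.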

It seems from the description you provide that the issue is due to your servers consuming from multiple queues at the same time.

As your prefetch count is set to 1 per consumer, a server consuming from 3 queues will fetch up to 3 messages (one per queue) even though it will only be processing one at a time (per your description of the processing).

It's not clear from your question whether there is a need for multiple queues or whether you could have all tasks end up in a single queue:

  • Do all the servers consume all the tasks?
  • Do you need to be able to stop the processing of certain tasks?

If you need/wish to be able to "stop" the processing of certain tasks, or to control the distribution of processing across your servers, you'll need to manage the consumers in your servers so that only one is active at a time (otherwise some messages will be fetched and held by a busy server because of the prefetch of 1).
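A rough sketch of that idea (the class and names are illustrative, not your code): keep a single active consumer per server, cancelling the current subscription before subscribing to another queue, so the server never buffers a prefetched message from a queue it is not actively working on.

    import pika

    class SingleConsumerServer:
        """Illustrative sketch: at most one active consumer per server."""

        def __init__(self, host="localhost"):
            self._connection = pika.BlockingConnection(pika.ConnectionParameters(host=host))
            self._channel = self._connection.channel()
            self._channel.basic_qos(prefetch_count=1)
            self._consumer_tag = None

        def switch_to(self, queue_name):
            # Cancel the previous subscription so only queue_name is consumed from.
            if self._consumer_tag is not None:
                self._channel.basic_cancel(self._consumer_tag)
            self._channel.queue_declare(queue=queue_name, durable=True)
            self._consumer_tag = self._channel.basic_consume(
                queue=queue_name, on_message_callback=self._handle)

        def _handle(self, ch, method, properties, body):
            # ... long-running work ...
            ch.basic_ack(delivery_tag=method.delivery_tag)

        def run(self):
            self._channel.start_consuming()

With a BlockingConnection the switch has to be triggered from the consuming thread (or via connection.add_callback_threadsafe), but the cancel-then-consume pattern is the core of it.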

If you do not need to control the processing of the various tasks, it would be far simpler to have all of the messages end up in a single queue, and to have a single consumer on that queue, set up with a prefetch of 1, in each of your servers.
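In that case the standard work-queue pattern applies; a minimal sketch (the queue name and payload are illustrative):

    import pika

    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="all_tasks", durable=True)

    # Producer side: publish every task, whatever its type, to the shared queue.
    channel.basic_publish(
        exchange="",
        routing_key="all_tasks",
        body=b"messageA",
        properties=pika.BasicProperties(delivery_mode=2),  # persist the message
    )

    # Consumer side (one per server): at most one unacknowledged task at a time, so
    # RabbitMQ only hands a server a new message once it has acked the previous one.
    channel.basic_qos(prefetch_count=1)

    def handle_task(ch, method, properties, body):
        # ... long-running work ...
        ch.basic_ack(delivery_tag=method.delivery_tag)

    channel.basic_consume(queue="all_tasks", on_message_callback=handle_task)
    channel.start_consuming()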
