
Spring Cloud Stream + RabbitMQ - Consuming existing messages in queue

I have a RabbitMQ message broker running on a server, and I'm trying to configure a Producer and a Consumer for it using Spring Cloud Stream. My Producer creates messages in a queue every second, and my Consumer reads them at the same rate. However, if I stop my Consumer while the Producer keeps pushing messages, then when I restart the Consumer it cannot retrieve the messages created during the time it was down; it only picks up the messages produced after it started. How can I make my Consumer consume the messages already in the queue when it starts?

Here are my Consumer properties:

cloud:
    stream:
      bindings:
        input:
          destination: spring-cloud-stream-demo
          consumer:
            auto-bind-dlq: true
            republishToDlq: true
            maxAttempts: 5

And my Producer properties:

cloud:
    stream:
      bindings:
        output:
          destination: spring-cloud-stream-demo

Appreciate any help!

You need to add a group to the consumer (input) binding; otherwise it will bind an anonymous, auto-delete queue to the exchange.

With a group, a permanent, durable queue is bound instead.
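
For example, a minimal sketch of the consumer binding from the question with a group added (the group name myGroup is an assumption; with the RabbitMQ binder this creates a durable queue named spring-cloud-stream-demo.myGroup bound to the spring-cloud-stream-demo exchange):

cloud:
    stream:
      bindings:
        input:
          destination: spring-cloud-stream-demo
          group: myGroup
          consumer:
            auto-bind-dlq: true
            republishToDlq: true
            maxAttempts: 5

Because the queue survives consumer restarts, messages published while the consumer is down accumulate in it and are delivered when the consumer comes back up.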

I have a question too. Before the consumer is up, I send data through the producer. After some time, I start the consumer. The consumer only receives the data produced after it came up, so it doesn't read from the beginning. Is there any way to fix this? - GaryRussell
