
I need to save data in MongoDB and send this data to Kafka atomically

I have a Spring Boot application which persists data to MongoDB and sends this data to Kafka. I want these two processes to run atomically: if the data is persisted to Mongo, it should also be sent to Kafka. How can I do this?

With Kafka itself you can't.

Kafka offers transactions, but they are restricted to writing to multiple partitions in Kafka atomically. They are designed with stream processing in mind (consuming from one topic and producing to another in one go), but a Kafka transaction cannot know whether a write to Mongo succeeded.

The use case you have appears regularly, though. Usually you would use the outbox pattern: modify only one of the two resources (the database or Apache Kafka) and drive the update of the second one from that, in an eventually consistent manner.
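To make the pattern concrete, here is a minimal in-memory sketch. All names (`OutboxDemo`, `saveOrderTransactionally`, `relayOnce`) are illustrative, not from any library: the "Mongo transaction" is a synchronized method, and "Kafka" is just a list standing in for a producer. The essential idea is that the business record and the outbox event are written in one database transaction, and a separate relay publishes unpublished events to Kafka afterwards.

```java
import java.util.*;

// Hypothetical sketch of the outbox pattern with in-memory stand-ins.
class OutboxDemo {
    // Stand-ins for Mongo collections: business data and the outbox.
    static final Map<String, String> orders = new LinkedHashMap<>();
    static final List<Map<String, Object>> outbox = new ArrayList<>();
    // Stand-in for the Kafka topic the relay publishes to.
    static final List<String> kafkaTopic = new ArrayList<>();

    // Simulates a Mongo multi-document transaction: the business
    // record and the outbox event become visible together or not at all.
    static synchronized void saveOrderTransactionally(String id, String payload) {
        orders.put(id, payload);
        Map<String, Object> event = new HashMap<>();
        event.put("aggregateId", id);
        event.put("payload", payload);
        event.put("published", false);
        outbox.add(event);
    }

    // Simulates the relay (a poller or a CDC tool such as Debezium)
    // that pushes unpublished outbox events to Kafka and marks them sent.
    static synchronized void relayOnce() {
        for (Map<String, Object> event : outbox) {
            if (!(Boolean) event.get("published")) {
                kafkaTopic.add((String) event.get("payload")); // producer.send(...)
                event.put("published", true); // at-least-once: mark after send
            }
        }
    }

    public static void main(String[] args) {
        saveOrderTransactionally("order-1", "{\"item\":\"book\"}");
        relayOnce();
        System.out.println(kafkaTopic);        // [{"item":"book"}]
        relayOnce();                           // idempotent: nothing re-sent
        System.out.println(kafkaTopic.size()); // 1
    }
}
```

Note the semantics: the relay gives you at-least-once delivery (a crash between `send` and marking the event published causes a re-send), so downstream consumers should be idempotent.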

If you really need atomic writes, I believe it is possible to do so by relying on the ACID guarantees MongoDB >= 4.2 gives you instead of Kafka's transactional guarantees. But this would mean you need to manage the Kafka offsets in Mongo yourself.
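A minimal sketch of that idea, again with in-memory stand-ins (the class and method names are hypothetical): the processed result and the consumer offset are committed in the same database transaction, so a record replayed after a crash can be detected by comparing its offset against the committed one and skipped.

```java
import java.util.*;

// Hypothetical sketch of the "offsets stored in the database" approach.
class OffsetsInDbDemo {
    // Stand-ins for Mongo: processed results plus an offsets document.
    static final Map<String, String> results = new HashMap<>();
    static final Map<Integer, Long> offsets = new HashMap<>(); // partition -> next offset

    // Simulated atomic transaction: result and offset commit together,
    // so the two can never disagree after a crash.
    static synchronized void processRecord(int partition, long offset,
                                           String key, String value) {
        long committed = offsets.getOrDefault(partition, 0L);
        if (offset < committed) {
            return; // already processed before the crash: skip the duplicate
        }
        results.put(key, value);
        offsets.put(partition, offset + 1);
    }

    public static void main(String[] args) {
        processRecord(0, 0, "a", "v1");
        processRecord(0, 1, "b", "v2");
        // Simulate a replay after a crash: offset 1 is delivered again.
        processRecord(0, 1, "b", "v2");
        System.out.println(offsets.get(0)); // 2
        System.out.println(results.size()); // 2
    }
}
```

With a real consumer you would disable auto-commit and, on startup or rebalance, `seek()` each assigned partition to the offset stored in Mongo rather than the one Kafka has committed.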

If you have "Kafka: The Definitive Guide", 2nd edition, there is a short chapter with more details about what exactly Kafka transactions can and cannot do, and possible workarounds.
