
How can I configure a Kafka connector that takes messages from SQS and moves them to a Kafka topic?

I have a use case where I want to move messages from SQS to a Kafka topic. The framework to be used is Spring Boot, so whenever I run my code it should start moving the messages. I searched for articles on this but found very few. I am looking for some boilerplate code to start with that follows best practices, and guidance on how to proceed further.

Thanks in advance.

You need to make yourself familiar with Enterprise Integration Patterns and its Spring Integration implementation.

To take messages from AWS SQS you would need to use an SqsMessageDrivenChannelAdapter from the Spring Integration for AWS extension. To post records into an Apache Kafka topic you need a KafkaProducerMessageHandler from the spring-integration-kafka module.

Then you wire everything together via an IntegrationFlow bean in your Spring Boot configuration.
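A minimal sketch of such a flow follows. It assumes Spring Integration 6 with the spring-integration-aws (AWS SDK v2) and spring-integration-kafka dependencies on the classpath; the queue name my-sqs-queue, the topic name my-kafka-topic, and the injected SqsAsyncClient and KafkaTemplate beans are placeholders for your own setup.

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.aws.inbound.SqsMessageDrivenChannelAdapter;
    import org.springframework.integration.dsl.IntegrationFlow;
    import org.springframework.integration.kafka.dsl.Kafka;
    import org.springframework.kafka.core.KafkaTemplate;
    import software.amazon.awssdk.services.sqs.SqsAsyncClient;

    @Configuration
    public class SqsToKafkaConfiguration {

        @Bean
        public IntegrationFlow sqsToKafkaFlow(SqsAsyncClient sqsClient,
                                              KafkaTemplate<String, String> kafkaTemplate) {
            return IntegrationFlow
                    // Inbound: the channel adapter listens to the SQS queue and emits
                    // each received message into the flow.
                    .from(new SqsMessageDrivenChannelAdapter(sqsClient, "my-sqs-queue"))
                    // Outbound: the Kafka handler publishes each payload to the topic.
                    .handle(Kafka.outboundChannelAdapter(kafkaTemplate)
                            .topic("my-kafka-topic"))
                    .get();
        }
    }

With this arrangement the SQS consumption and the Kafka production stay decoupled behind message channels, so you can later insert transformers, filters, or error handling between the two endpoints without touching either adapter.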

Of course, you can use Spring Cloud for AWS and Spring for Apache Kafka directly. The choice is yours, but it is better to follow best practices and develop a real integration solution.
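If you do take the direct route, a hypothetical bridge component could look like the sketch below, assuming Spring Cloud AWS 3.x (io.awspring.cloud) for the @SqsListener support and Spring for Apache Kafka for the KafkaTemplate; the queue and topic names are again placeholders.

    import io.awspring.cloud.sqs.annotation.SqsListener;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.stereotype.Component;

    @Component
    public class SqsToKafkaBridge {

        private final KafkaTemplate<String, String> kafkaTemplate;

        public SqsToKafkaBridge(KafkaTemplate<String, String> kafkaTemplate) {
            this.kafkaTemplate = kafkaTemplate;
        }

        // Spring Cloud AWS invokes this method for every message received from
        // the queue; the payload is simply forwarded to the Kafka topic.
        @SqsListener("my-sqs-queue")
        public void forward(String payload) {
            kafkaTemplate.send("my-kafka-topic", payload);
        }
    }

Note that this couples consumption and production in a single method, which is fine for a simple forwarder but gives you less room to grow than the Spring Integration flow above.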

Apache Kafka offers multiple ways to ingest data from different sources, e.g. Kafka Connect, Kafka Producer, etc. We need to be careful while selecting a specific Kafka component, keeping things such as the retry mechanism and scalability in mind.

The best solution in this case would be to use the Amazon SQS Source Connector to ingest data from AWS SQS into a Kafka topic, and then write your consumer application to do whatever is necessary with the stream of records from that particular topic.
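As a sketch, the connector could be registered with a Kafka Connect cluster through its REST API using a payload like the one below. The connector class and property names follow Confluent's SQS Source Connector documentation as I recall it, but treat them as assumptions and verify them against the connector version you install; this connector is a Confluent-licensed plugin and needs the confluent.topic.* settings for its license topic. The queue URL, topic name, and bootstrap servers are placeholders.

    {
      "name": "sqs-source-connector",
      "config": {
        "connector.class": "io.confluent.connect.sqs.source.SqsSourceConnector",
        "tasks.max": "1",
        "sqs.url": "https://sqs.us-east-1.amazonaws.com/123456789012/my-sqs-queue",
        "kafka.topic": "my-kafka-topic",
        "confluent.topic.bootstrap.servers": "localhost:9092",
        "confluent.topic.replication.factor": "1"
      }
    }

POST this JSON to the Connect worker's /connectors endpoint to start the connector; AWS credentials are typically resolved through the default provider chain unless overridden in the configuration.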
