
Kafka to Google Pub/Sub using Sink Connector

Sorry for the basic question.

Requirement: I need to send a JSON payload from a Spring Boot application to Google Pub/Sub.

I have implemented a Kafka producer that publishes messages to topics, and a Kafka consumer that receives the JSON.

Now I need to send the JSON payload from the Kafka consumer to Google Pub/Sub. I am confused: do I need a Kafka consumer for this, or can I send directly from the Kafka producer to Google Pub/Sub by using the Kafka connector JAR and configuring the topics in the properties? Any help with this implementation is much appreciated.

If you want to get messages from Kafka into Google Cloud Pub/Sub, you should not need to write your own consumer. You can create an instantiation of the Google Cloud Pub/Sub Kafka connector. The Kafka Connect service is usually part of the Kafka deployment itself, so you just need to start an instance of it configured to run the Cloud Pub/Sub sink connector. The README file for the connector details the steps that need to be taken, but to summarize:

  1. Download the latest release and build it via the mvn package command.
  2. Copy target/cps-kafka-connector.jar to a place on your Java classpath so it is accessible when Kafka Connect runs.
  3. Copy the sink config and change it to point to the appropriate Cloud Pub/Sub project and topic and the appropriate Kafka topics.
  4. Make a copy of config/connect-standalone.properties or config/connect-distributed.properties, depending on whether you want a single instance or multiple instances of the connector running.
  5. Update the key.converter and value.converter properties to org.apache.kafka.connect.storage.StringConverter. This way, the connector will not try to interpret the data being passed but will instead pass the JSON straight through to Cloud Pub/Sub.
  6. Start up the connector with the appropriate command based on standalone vs. distributed mode, e.g., bin/connect-standalone.sh <standalone config file> <connector config file>.
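To make step 3 concrete, a minimal sink-connector configuration might look like the following; the project, topic, and connector names here are placeholders, not values from the question:

```properties
# Hypothetical cps-sink-connector.properties; replace the placeholder
# names with your own GCP project, Pub/Sub topic, and Kafka topic.
name=cps-sink-connector
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
tasks.max=1
topics=my-kafka-topic
cps.project=my-gcp-project
cps.topic=my-pubsub-topic
```

Kafka Connect reads this file at startup (step 6) and begins copying records from the listed Kafka topics into the configured Cloud Pub/Sub topic.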

Messages should now be flowing from Kafka into Google Cloud Pub/Sub. If you are just using Kafka to go from Spring Boot to Cloud Pub/Sub, then you could avoid the Kafka step by setting up an outbound channel adapter to send messages to Cloud Pub/Sub directly.
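As a sketch of that direct alternative, here is roughly what an outbound channel adapter looks like with the Spring Cloud GCP Pub/Sub integration. The channel, gateway, and topic names are illustrative assumptions, not anything from the question:

```java
// Hypothetical sketch of a Pub/Sub outbound channel adapter using
// Spring Cloud GCP; bean, channel, and topic names are illustrative.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.annotation.MessagingGateway;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.MessageHandler;

import com.google.cloud.spring.pubsub.core.PubSubTemplate;
import com.google.cloud.spring.pubsub.integration.outbound.PubSubMessageHandler;

@Configuration
public class PubSubSenderConfig {

    // Outbound channel adapter: any message sent to "pubsubOutputChannel"
    // is published to the "my-pubsub-topic" Cloud Pub/Sub topic.
    @Bean
    @ServiceActivator(inputChannel = "pubsubOutputChannel")
    public MessageHandler messageSender(PubSubTemplate pubsubTemplate) {
        return new PubSubMessageHandler(pubsubTemplate, "my-pubsub-topic");
    }
}

// Gateway the application code calls to send the JSON payload.
@MessagingGateway(defaultRequestChannel = "pubsubOutputChannel")
interface PubSubGateway {
    void sendToPubSub(String jsonPayload);
}
```

With this in place, injecting PubSubGateway and calling sendToPubSub(json) publishes the payload directly, with no Kafka broker involved.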
