
How to consume a Kafka topic and write it to an MQ topic

I have a requirement where I need to consume a Kafka topic and write the messages to an MQ topic. Can someone advise me on the best way to do this? I am new to Kafka.

I have read about the IBM MQ connector in Confluent, but I could not work out how to implement it.

The best way to move data from Kafka to MQ is to use the IBM MQ sink connector: https://github.com/ibm-messaging/kafka-connect-mq-sink

This is a Kafka Connect connector. The README contains details for building and running it.
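As a rough illustration, a standalone configuration for that connector looks something like the fragment below. The property names come from the connector's README, but every value here (queue manager, channel, host, queue, topic names) is a placeholder you must replace with your own, and you should check the README for the full, current list of options:

```properties
name=mq-sink
connector.class=com.ibm.eventstreams.connect.mqsink.MQSinkConnector
tasks.max=1

# Kafka topic(s) to consume from (placeholder)
topics=TSINK

# MQ connection details (all placeholders)
mq.queue.manager=QM1
mq.connection.name.list=localhost(1414)
mq.channel.name=DEV.APP.SVRCONN
mq.queue=DEV.QUEUE.1

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
```

You would then pass this file to a Kafka Connect worker, for example via the `connect-standalone` script shipped with Kafka, alongside the worker's own configuration file.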

Kafka has a component called Kafka Connect. It is used to move data between Kafka and other systems, such as a database or, in your case, MQ.

Kafka Connect has two kinds of connectors:

Source connectors - read data from an external system and write it to Kafka (for example, read inserted/modified rows from a database table and produce them to a Kafka topic).

Sink connectors - read messages from Kafka and write them to an external system.

The link you have added is a source connector; it will read messages from MQ and write them to Kafka.

For a simple use case you do not need Kafka Connect. You can write a simple Kafka consumer that reads data from a Kafka topic and writes it to MQ:

    import java.time.Duration;
    import java.util.Arrays;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("group.id", "test");
    props.put("enable.auto.commit", "true");
    props.put("auto.commit.interval.ms", "1000");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
    consumer.subscribe(Arrays.asList("foo", "bar"));
    while (true) {
        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
        for (ConsumerRecord<String, String> record : records) {
            // Insert code to publish the record to MQ here
            System.out.printf("offset = %d, key = %s, value = %s%n",
                    record.offset(), record.key(), record.value());
        }
    }
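For the "publish the record to MQ" step inside the loop, a sketch using the IBM MQ JMS client might look like the following. This is illustrative only, not a complete or tested implementation: it assumes the IBM MQ client library is on the classpath, and the host, port, queue manager, channel, credentials, and topic name are all placeholders for your own environment.

```java
import javax.jms.Connection;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.jms.Topic;

import com.ibm.mq.jms.MQConnectionFactory;
import com.ibm.msg.client.wmq.WMQConstants;

// Connection details are placeholders; set them up once, outside the poll loop.
MQConnectionFactory cf = new MQConnectionFactory();
cf.setHostName("localhost");
cf.setPort(1414);
cf.setQueueManager("QM1");
cf.setChannel("DEV.APP.SVRCONN");
cf.setTransportType(WMQConstants.WMQ_CM_CLIENT);

Connection conn = cf.createConnection("app", "password");
Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
Topic topic = session.createTopic("DEV.BASE.TOPIC"); // placeholder topic name
MessageProducer producer = session.createProducer(topic);
conn.start();

// Inside the consumer loop: forward each Kafka record's value to the MQ topic.
TextMessage msg = session.createTextMessage(record.value());
producer.send(msg);
```

In a real application you would also handle errors, close the JMS resources on shutdown, and think about delivery guarantees (for example, whether auto-commit of Kafka offsets is acceptable if the MQ send fails).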
