Copy data between Kafka topics using Kafka connectors
I'm new to Kafka, and I need to copy data from one Kafka topic to another. I'm wondering what the possible ways of doing so are. The ways I can think of are the following:

My question is: is it possible to use two Kafka connectors in between? E.g. a sink connector + a source connector. If so, could you please provide me with some good examples, or some hints on how to do so?

Thanks in advance!
All the methods you listed are possible. Which one is best really depends on how much control you want over the process, and on whether it's a one-off operation or something you want to keep running.
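On the connector question specifically: yes, Kafka Connect can do this, and MirrorMaker 2 (bundled with Kafka since 2.4) already packages the pattern as a single source connector, `MirrorSourceConnector`, which consumes from one cluster and produces to another; you can point both sides at the same cluster to copy between topics. A sketch of the connector properties (the connector name, aliases, addresses, and topic names are placeholders, not anything from your setup):

```properties
# Sketch of a MirrorSourceConnector config submitted to a Kafka Connect
# worker; cluster addresses and topic names are placeholders.
name=copy-input-topic
connector.class=org.apache.kafka.connect.mirror.MirrorSourceConnector
source.cluster.alias=source
target.cluster.alias=target
source.cluster.bootstrap.servers=localhost:9092
target.cluster.bootstrap.servers=localhost:9092
topics=input-topic
# NOTE: the default replication policy writes to "source.input-topic" on
# the target side; keeping the original topic name requires configuring a
# custom replication.policy.class.
```

This runs continuously and tracks offsets for you, which makes it a better fit for an ongoing mirror than for a one-off copy.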
Kafka Streams offers an easy way to flow one topic into another via the DSL.

You could do something like this (demo code, obviously not production-ready):
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-wordcount");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

// Copy the raw bytes through without deserializing them
final Serde<byte[]> bytesSerdes = Serdes.ByteArray();

final StreamsBuilder builder = new StreamsBuilder();
KStream<byte[], byte[]> input = builder.stream(
        "input-topic",
        Consumed.with(bytesSerdes, bytesSerdes)
);
input.to("output-topic", Produced.with(bytesSerdes, bytesSerdes));

final KafkaStreams streams = new KafkaStreams(builder.build(), props);
try {
    streams.start();
    Thread.sleep(60000L);  // let it run for a minute, then shut down
} catch (Exception e) {
    e.printStackTrace();
} finally {
    streams.close();
}
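For a genuinely one-off copy where you don't need to preserve record keys or headers, piping the console consumer into the console producer is the quickest option. A sketch (broker address and topic names are placeholders; `--timeout-ms` makes the consumer exit once the topic is drained):

```
# One-off copy; record keys and headers are NOT preserved by default,
# so records may land on different partitions in the target topic.
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic input-topic --from-beginning --timeout-ms 10000 \
  | kafka-console-producer.sh --bootstrap-server localhost:9092 \
    --topic output-topic
```

For anything beyond a throwaway copy, the Streams or Connect approaches above are more robust, since they preserve keys and handle restarts.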