
Copy data between Kafka topics using Kafka connectors

I'm new to Kafka, and I need to copy data from one Kafka topic to another. I'm wondering what the possible ways to do so are. The ways I can think of are the following:

  1. Kafka consumer + Kafka producer
  2. Kafka streams
  3. Kafka sink connector + producer
  4. Kafka consumer + source connector

My question is: is it possible to use two Kafka connectors in between, e.g. a sink connector + a source connector? If so, could you please provide some good examples, or some hints on how to do so?

Thanks in advance!

All the methods you listed are possible. Which one is best really depends on how much control you want over the process, and whether it's a one-off operation or something you want to keep running.

Kafka Streams offers an easy way to flow one topic into another via its DSL.

You could do something like this (demo code, obviously not production-ready):

import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

Properties props = new Properties();
props.put(StreamsConfig.APPLICATION_ID_CONFIG, "streams-wordcount");
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

// Copy the records through untouched, as raw bytes
final Serde<byte[]> bytesSerdes = Serdes.ByteArray();

final StreamsBuilder builder = new StreamsBuilder();
KStream<byte[], byte[]> input = builder.stream(
        "input-topic",
        Consumed.with(bytesSerdes, bytesSerdes)
);
input.to("output-topic", Produced.with(bytesSerdes, bytesSerdes));

final KafkaStreams streams = new KafkaStreams(builder.build(), props);
try {
    streams.start();
    // Let the copy run for a minute, then shut down (demo only)
    Thread.sleep(60000L);
} catch (Exception e) {
    e.printStackTrace();
} finally {
    streams.close();
}
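
For comparison, option 1 from your list (a plain consumer + producer) is also straightforward, especially for a one-off copy. Below is a minimal sketch, assuming the same local broker and the placeholder topic names input-topic and output-topic; in practice you would also want explicit offset-commit and error handling:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArrayDeserializer;
import org.apache.kafka.common.serialization.ByteArraySerializer;

Properties consumerProps = new Properties();
consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "topic-copy");
consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);
consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class);

Properties producerProps = new Properties();
producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);

try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(consumerProps);
     KafkaProducer<byte[], byte[]> producer = new KafkaProducer<>(producerProps)) {
    consumer.subscribe(Collections.singletonList("input-topic"));
    // Keep polling until the process is stopped
    while (true) {
        ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(1));
        for (ConsumerRecord<byte[], byte[]> record : records) {
            // Forward key and value as-is to the target topic
            producer.send(new ProducerRecord<>("output-topic", record.key(), record.value()));
        }
    }
}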
