
Kafka Streams API: KStream to KTable

I have a Kafka cluster with 3 brokers and 300 topics.

I want to process each topic independently of the others using the Kafka Streams API, and write the results of each input topic to its own specific output topic:

inputtopic1 to outputtopic1 
inputtopic2  to outputtopic2
inputtopic3 to outputtopic3
inputtopic4 to outputtopic4 

etc.

Any ideas?

http://docs.confluent.io/current/streams/developer-guide.html#writing-streams-back-to-kafka

to() -> Terminal operation. Writes the records to a Kafka topic. (KStream details, KTable details)

When to provide serdes explicitly:

If you do not specify serdes explicitly, the default serdes from the configuration are used. You must specify serdes explicitly if the key and/or value types of the KStream or KTable do not match the configured default serdes. See Data types and serialization for information about configuring default serdes, available serdes, and implementing your own custom serdes. Several variants of to() exist, e.g., to specify a custom StreamPartitioner that gives you control over how output records are distributed across the partitions of the output topic.
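A minimal sketch of what this looks like in practice, assuming the newer StreamsBuilder API (Kafka 1.0+), String keys and values, and the topic naming from the question; the application id, broker address, and the uppercase mapValues step are placeholders for your own processing logic:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class PerTopicPipeline {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "per-topic-pipeline");  // placeholder app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");     // placeholder broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // One independent sub-topology per input topic:
        // inputtopic1 -> outputtopic1, inputtopic2 -> outputtopic2, ...
        for (int i = 1; i <= 300; i++) {
            KStream<String, String> stream = builder.stream("inputtopic" + i);
            stream
                .mapValues(value -> value.toUpperCase())  // placeholder processing step
                // to() is the terminal operation that writes back to Kafka;
                // serdes are passed explicitly here via Produced, otherwise
                // the configured defaults would be used
                .to("outputtopic" + i, Produced.with(Serdes.String(), Serdes.String()));
        }

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

Each iteration of the loop builds a separate source-to-sink sub-topology, so the 300 topics are processed independently within one Streams application; you could equally run separate application instances per topic if you want independent scaling and failure isolation.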
