How does a Kafka Streams KTable write data to a Kafka topic in (compacted) KV style?
I am playing with the WordCount demo application in Kafka (0.11.0.1) Streams:
// Serializers/deserializers (serde) for String and Long types
final Serde<String> stringSerde = Serdes.String();
final Serde<Long> longSerde = Serdes.Long();
// Construct a `KStream` from the input topic "streams-plaintext-input", where message values
// represent lines of text (for the sake of this example, we ignore whatever may be stored
// in the message keys).
KStream<String, String> textLines = builder.stream(stringSerde, stringSerde, "streams-plaintext-input");
KTable<String, Long> wordCounts = textLines
// Split each text line, by whitespace, into words.
.flatMapValues(value -> Arrays.asList(value.toLowerCase().split("\\W+")))
// Group the text words as message keys
.groupBy((key, value) -> value)
// Count the occurrences of each word (message key).
.count("Counts");
// Store the running counts as a changelog stream to the output topic.
wordCounts.to(stringSerde, longSerde, "streams-wordcount-output");
In step 5, after some data has been processed, we can see the compacted KV pairs (e.g. `streams 2`) in the sink topic streams-wordcount-output:
> bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
--topic streams-wordcount-output \
--from-beginning \
--formatter kafka.tools.DefaultMessageFormatter \
--property print.key=true \
--property print.value=true \
--property key.deserializer=org.apache.kafka.common.serialization.StringDeserializer \
--property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
all 1
streams 1
lead 1
to 1
kafka 1
hello 1
kafka 2
streams 2
My question is: how does the KTable wordCounts write its data to the topic streams-wordcount-output in the key-value style shown above?
The cleanup.policy option of the topic streams-wordcount-output appears to be the default, delete, not compact (checked via bin/kafka-configs.sh).
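The config check mentioned above might look like this (a sketch assuming a local ZooKeeper at localhost:2181, which is what kafka-configs.sh connected to in the 0.11 era):

```shell
# Describe any per-topic config overrides for the sink topic.
# An empty result means everything, including cleanup.policy
# (default: delete), is at the broker default.
bin/kafka-configs.sh --zookeeper localhost:2181 \
  --entity-type topics \
  --entity-name streams-wordcount-output \
  --describe
```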
All input and output topics are "out of scope" for Kafka Streams: it is the user's responsibility to create and configure these topics. Thus, your topic "streams-wordcount-output" will have whatever configuration you specified when you created it.
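This is also why both `kafka 1` and `kafka 2` appear in the consumer output: the KTable emits every count update as a new record to the changelog topic, and nothing is compacted away unless the topic itself is configured with cleanup.policy=compact. A minimal sketch in plain Java (not the Streams API, just the changelog semantics):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class ChangelogSketch {

    // Each update to a running count is appended to the output as a
    // new (key, value) record. Earlier records for the same key stay
    // in the topic until log cleanup removes them, and per-key
    // compaction only happens when cleanup.policy=compact.
    static List<String> changelog(List<String> words) {
        Map<String, Long> counts = new HashMap<>();
        List<String> records = new ArrayList<>();
        for (String word : words) {
            long count = counts.merge(word, 1L, Long::sum);
            records.add(word + " " + count);
        }
        return records;
    }

    public static void main(String[] args) {
        // The same words as in the demo run above:
        // "all streams lead to kafka" / "hello kafka streams"
        List<String> input = Arrays.asList(
            "all", "streams", "lead", "to", "kafka",
            "hello", "kafka", "streams");
        changelog(input).forEach(System.out::println);
    }
}
```

Running this prints the same sequence the console consumer shows, including both `kafka 1` and `kafka 2`, because each record is an update, not a replacement.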