
Unable to read messages from Kafka topic using console consumer

I created stream1 in KSQL (version 5.0 beta) with a backing topic topic1 and an Avro schema. I am able to read all the messages on topic1 using kafka-avro-console-consumer.

I then created stream2 in KSQL, based on stream1 but with JSON format for the messages and a backing topic named topic2. I am able to read all the messages on topic2 using kafka-console-consumer.

I created stream3 in KSQL, based on stream2, with JSON message format and a backing topic named topic3. However, I am unable to read the messages on topic3 using kafka-console-consumer.

Using kafkacat I get offsets on the various partitions of topic3, but none of the actual messages is printed.

It looks like the messages are in the topic, but neither kafkacat nor kafka-console-consumer is able to print them.

I tried using --from-beginning and --offset earliest --partition 0, with no luck.

Here are the KSQL statements:

CREATE STREAM stream1(p_id STRING, location STRING, u_id STRING, r_id STRING, b_id STRING, recorded_dtm STRING, 
v_type STRING, value STRING) WITH (kafka_topic='topic1', value_format='AVRO');

CREATE STREAM stream2 WITH (KAFKA_TOPIC='topic2', VALUE_FORMAT='JSON', TIMESTAMP='RECORDED_TIMESTAMP')
AS SELECT P_ID+'-'+LOCATION+'-'+U_ID+'-'+R_ID+'-'+B_ID+'-'+V_TYPE AS PARTITION_KEY,
LOCATION, U_ID, R_ID, V_TYPE, B_ID, STRINGTOTIMESTAMP(RECORDED_DTM, 'yyyyMMddHHmmss') AS RECORDED_TIMESTAMP,
P_ID, VALUE, RECORDED_DTM, 'NM' AS DATA_TYPE
FROM stream1 PARTITION BY PARTITION_KEY;

CREATE STREAM stream3 WITH (KAFKA_TOPIC='topic3', VALUE_FORMAT='JSON', TIMESTAMP='RECORDED_TIMESTAMP')
AS SELECT PARTITION_KEY, LOCATION, U_ID, R_ID, V_TYPE, B_ID, RECORDED_TIMESTAMP,
P_ID, VALUE, RECORDED_DTM FROM stream2 PARTITION BY PARTITION_KEY;
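For reference, the STRINGTOTIMESTAMP(RECORDED_DTM, 'yyyyMMddHHmmss') call in the stream2 query converts the string column into epoch milliseconds, which the TIMESTAMP='RECORDED_TIMESTAMP' property then uses as the record timestamp. A rough Python sketch of those two pieces of the query (the sample values below are hypothetical, and UTC is assumed):

```python
from datetime import datetime, timezone

def string_to_timestamp(recorded_dtm: str) -> int:
    """Rough equivalent of KSQL's STRINGTOTIMESTAMP(col, 'yyyyMMddHHmmss'):
    parse the string and return epoch milliseconds (assuming UTC)."""
    dt = datetime.strptime(recorded_dtm, "%Y%m%d%H%M%S").replace(tzinfo=timezone.utc)
    return int(dt.timestamp() * 1000)

def partition_key(p_id, location, u_id, r_id, b_id, v_type):
    """Mirrors the PARTITION_KEY string concatenation in the stream2 query."""
    return "-".join([p_id, location, u_id, r_id, b_id, v_type])

print(string_to_timestamp("20180701120000"))  # 1530446400000 (2018-07-01 12:00:00 UTC)
print(partition_key("p1", "loc1", "u1", "r1", "b1", "NM"))  # p1-loc1-u1-r1-b1-NM
```

The key point is that the resulting epoch value, not the broker's receive time, becomes the timestamp stamped on every record written to topic2 and topic3.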

Additional info

In ksql, if I run SET 'auto.offset.reset'='earliest'; and then run select * from stream1 limit 5; or select * from stream2 limit 5; I see records printed, but select * from stream3 limit 5; does not return any records.

If I run describe extended stream3 I get

total-messages: 212

which happens to be the number of messages I sent to topic1.

The root cause was the TIMESTAMP property on STREAM3. The recorded_dtm values in the messages sent to topic1 were older than the retention window defined by the log.retention.hours value set in the Kafka server.properties.

Our log.retention.hours value is set to 24 hours, and the recorded_dtm values were more than 24 hours in the past. This caused the messages in STREAM3 / topic3 to be removed almost immediately under the retention policy.
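To illustrate the failure mode, here is a minimal Python sketch (with hypothetical timestamps) of the check the broker effectively applies: a record whose timestamp lies more than log.retention.hours behind the current time is already past the retention cutoff when it arrives, so it becomes eligible for deletion as soon as its log segment is considered for cleanup.

```python
from datetime import datetime, timedelta, timezone

LOG_RETENTION_HOURS = 24  # matches the broker's log.retention.hours setting

def is_past_retention(record_ts_ms: int, now: datetime) -> bool:
    """True if a record timestamp is older than the retention window,
    i.e. the broker may delete it the next time its segment is cleaned."""
    cutoff = now - timedelta(hours=LOG_RETENTION_HOURS)
    return record_ts_ms < int(cutoff.timestamp() * 1000)

now = datetime(2018, 7, 10, tzinfo=timezone.utc)
old_ts = int(datetime(2018, 7, 1, tzinfo=timezone.utc).timestamp() * 1000)
fresh_ts = int(datetime(2018, 7, 9, 12, tzinfo=timezone.utc).timestamp() * 1000)
print(is_past_retention(old_ts, now))    # True  -> deleted almost immediately
print(is_past_retention(fresh_ts, now))  # False -> retained
```

This is a simplification: Kafka actually deletes whole log segments rather than individual records, using the largest timestamp in each segment, but the effect here is the same since every record in topic3 carried an old RECORDED_TIMESTAMP.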
