
Kafka consumer not consuming the data produced into topic

I have to create a Kafka producer that generates the sequence of numbers from 1 to 300. Each message I write has to contain the topic, a key, and a value, where the value is the binary encoding of the number to write.

This is the code I've created:

from kafka import KafkaProducer
import time

producer = KafkaProducer(bootstrap_servers='Cloudera02:9092')

for i in range(1, 301):  # 1 to 300 inclusive
    value = bytes(str(i), 'utf-8')
    key = bytes(str(i), 'utf-8')  # key must be bytes, not a tuple
    producer.send('PEC5', key=key, value=value)
    time.sleep(3)
producer.flush()
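kafka-python rejects keys and values that are not bytes, so both have to be encoded before sending. The conversion itself can be checked without a broker; the helper name below is just for illustration:

```python
def to_kafka_bytes(i):
    """Encode an integer as UTF-8 bytes, the form kafka-python's send() expects."""
    return str(i).encode('utf-8')

print(to_kafka_bytes(300))  # b'300'
```

`str(i).encode('utf-8')` and `bytes(str(i), 'utf-8')` are equivalent; either works for both the key and the value.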

The Kafka consumer should read the messages and print only the value to the console.

from kafka import KafkaConsumer

consumer = KafkaConsumer('PEC5', bootstrap_servers='Cloudera02:9092', auto_offset_reset='smallest', consumer_timeout_ms=10000)

for message in consumer:
    print(message.value)  # each ConsumerRecord's value is raw bytes

I am running two terminals, one with the producer and a second with the consumer, but nothing gets printed to the console. Any idea what's wrong?

The auto_offset_reset value is wrong in the KafkaConsumer. From the docs, it has only two valid values, latest and earliest; any other value will raise an exception:

auto_offset_reset (str) – A policy for resetting offsets on OffsetOutOfRange errors: 'earliest' will move to the oldest available message, 'latest' will move to the most recent. Any other value will raise the exception. Default: 'latest'.

So construct the KafkaConsumer with auto_offset_reset='earliest':

consumer = KafkaConsumer('PEC5', bootstrap_servers='Cloudera02:9092', auto_offset_reset='earliest', consumer_timeout_ms=10000)
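Note that message.value is raw bytes, so the console will show entries like b'1'. Decoding back to text is a plain UTF-8 decode, sketched here on a literal payload so it runs without a broker:

```python
raw = b'42'  # a value exactly as the consumer would yield it
decoded = raw.decode('utf-8')
print(decoded)  # 42
```

Alternatively, kafka-python accepts a value_deserializer callable in the KafkaConsumer constructor (e.g. value_deserializer=lambda v: v.decode('utf-8')), which applies the same decode to every record automatically.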
