
Consumer not able to consume all the messages from Producer

I have created 3 separate containers with docker-compose: one for Kafka, a second for the Producer (Streamer), and a third for the Consumer.

version: "3"

services:
  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    networks:
      - stream-network
  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - 9092:9092
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 3
    networks:
      - stream-network
  streamer:
    build:
      context: ./streamingProducer/
    networks: 
      - stream-network
    depends_on:
      - kafka
  consumer:
    build:
      context: ./streamingConsumer/
    networks: 
      - stream-network
    depends_on:
      - kafka

I am producing 10 messages from the producer inside a container; below is the code:

from confluent_kafka import Producer
import pprint
from faker import Faker
#from bson.json_util import dumps
import time


def delivery_report(err, msg):
    """ Called once for each message produced to indicate delivery result.
        Triggered by poll() or flush(). """
    if err is not None:
        print('Message delivery failed: {}'.format(err))
    else:
        print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))


# Generating fake data

myFactory = Faker()
myFactory.random.seed(5467)

for i in range(10):

    data = myFactory.name()
    print("data: ", data)

    # Produce sample message from localhost
    # producer = KafkaProducer(bootstrap_servers=['localhost:9092'], retries=5)
    # Produce message from docker
    producer = Producer({'bootstrap.servers': 'kafka:29092'})

    producer.poll(0)

    #producer.send('live-transactions', dumps(data).encode('utf-8'))
    producer.produce('mytopic', data.encode('utf-8'))

# block until all async messages are sent
producer.flush()
# tidy up the producer connection
# producer.close()
time.sleep(0.5)

and here are the 10 output messages:


streamer_1   | producer.py:35: DeprecationWarning: PY_SSIZE_T_CLEAN will be required for '#' formats
streamer_1   |   producer.produce('mytopic', data.encode('utf-8'))
streamer_1   | data:  Denise Reed
streamer_1   | data:  Megan Douglas
streamer_1   | data:  Philip Obrien
streamer_1   | data:  William Howell
streamer_1   | data:  Michael Williamson
streamer_1   | data:  Cheryl Jackson
streamer_1   | data:  Janet Bruce
streamer_1   | data:  Colton Martin
streamer_1   | data:  David Melton
streamer_1   | data:  Paula Ingram


When I try to consume the messages with the consumer, it only consumes the last message, which in this case is Paula Ingram, and then the program runs forever like an infinite loop. I'm not sure what's wrong. Here is the consumer code:

from kafka.consumer import KafkaConsumer
try:
    print('Welcome to parse engine')
    # From inside a container
    #consumer = KafkaConsumer('test-topic', bootstrap_servers='kafka:29092')
    # From localhost
    consumer = KafkaConsumer('mytopic', bootstrap_servers='localhost:9092', auto_offset_reset='earliest')
    for message in consumer:
        print('im a message')
        print(message.value.decode("utf-8"))

except Exception as e:
    print(e)
    # Logs the error appropriately. 
    pass

Any help would be appreciated. Thanks.

I suspect you are having consumer group issues.

auto_offset_reset='earliest' only applies when the consumer group does not already exist. If the group exists, it resumes from its last committed position.
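For example, giving the consumer an explicit, fresh group_id makes auto_offset_reset='earliest' take effect, and consumer_timeout_ms stops the iterator instead of blocking forever. A minimal sketch with kafka-python, assuming the same topic and bootstrap server as in the question; the group name is just an illustration:

from kafka import KafkaConsumer

# A fresh group has no committed offsets, so auto_offset_reset='earliest'
# applies and the consumer reads the topic from the beginning.
consumer = KafkaConsumer(
    'mytopic',
    bootstrap_servers='localhost:9092',
    group_id='parse-engine-debug',  # hypothetical name; any unused group works
    auto_offset_reset='earliest',
    enable_auto_commit=True,
    consumer_timeout_ms=10000,      # stop iterating after 10 s with no messages
)

for message in consumer:
    print(message.value.decode('utf-8'))

consumer.close()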


If that is not the case, it is not clear in what order you are running the consumer and producer, but I would start the consumer first, then run docker-compose restart streamer a few times.
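To see whether the streamer actually delivers all ten records on each restart, you can also attach the delivery_report callback that is already defined in the question and flush once at the end. A minimal sketch with confluent_kafka, assuming the same 'mytopic' topic and the kafka:29092 listener; unlike the original code, the producer is created once, outside the loop:

from confluent_kafka import Producer
from faker import Faker


def delivery_report(err, msg):
    # Called once per message when the broker confirms (or fails) delivery.
    if err is not None:
        print('Message delivery failed: {}'.format(err))
    else:
        print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))


myFactory = Faker()
myFactory.random.seed(5467)

# Create the producer once, not inside the loop.
producer = Producer({'bootstrap.servers': 'kafka:29092'})

for i in range(10):
    data = myFactory.name()
    producer.poll(0)  # serve delivery callbacks for previously queued messages
    producer.produce('mytopic', data.encode('utf-8'), callback=delivery_report)

# Block until every queued message has been delivered (or has failed).
producer.flush()

With the callback attached, you should see ten "Message delivered to mytopic [...]" lines if delivery succeeds, which tells you whether the problem is on the producing or the consuming side.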
