
Consumer not able to consume all the messages from the Producer

I have created 3 separate containers using docker-compose: one for Kafka, a second for the producer (streamer), and the last one for the consumer:

    version: "3"

    services:
      zookeeper:
        image: confluentinc/cp-zookeeper:latest
        environment:
          ZOOKEEPER_CLIENT_PORT: 2181
          ZOOKEEPER_TICK_TIME: 2000
        networks:
          - stream-network
      kafka:
        image: confluentinc/cp-kafka:latest
        depends_on:
          - zookeeper
        ports:
          - 9092:9092
        environment:
          KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
          KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
          KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
          KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
          KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 3
        networks:
          - stream-network
      streamer:
        build:
          context: ./streamingProducer/
        networks:
          - stream-network
        depends_on:
          - kafka
      consumer:
        build:
          context: ./streamingConsumer/
        networks:
          - stream-network
        depends_on:
          - kafka
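
For reference, with this listener map the broker advertises two addresses: containers on stream-network reach it at kafka:29092, while processes on the host use the published localhost:9092. A quick connectivity probe, as a sketch (assuming confluent_kafka is installed; the 5-second timeout is arbitrary):

    from confluent_kafka.admin import AdminClient

    # Inside a compose container only 'kafka:29092' is reachable;
    # on the host only 'localhost:9092' is. The probe reports both.
    for server in ('kafka:29092', 'localhost:9092'):
        admin = AdminClient({'bootstrap.servers': server})
        try:
            md = admin.list_topics(timeout=5)  # raises if the listener is unreachable
            print(server, '-> topics:', sorted(md.topics))
        except Exception as exc:
            print(server, '-> unreachable:', exc)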

I produce 10 messages from the producer inside the container; here is the code:

    from confluent_kafka import Producer
    import pprint
    from faker import Faker
    #from bson.json_util import dumps
    import time


    def delivery_report(err, msg):
        """ Called once for each message produced to indicate delivery result.
            Triggered by poll() or flush(). """
        if err is not None:
            print('Message delivery failed: {}'.format(err))
        else:
            print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))


    # Generating fake data

    myFactory = Faker()
    myFactory.random.seed(5467)

    for i in range(10):

        data = myFactory.name()
        print("data: ", data)

        # Produce sample message from localhost
        # producer = KafkaProducer(bootstrap_servers=['localhost:9092'], retries=5)
        # Produce message from docker
        producer = Producer({'bootstrap.servers': 'kafka:29092'})

        producer.poll(0)

        #producer.send('live-transactions', dumps(data).encode('utf-8'))
        producer.produce('mytopic', data.encode('utf-8'))

    # block until all async messages are sent
    producer.flush()
    # tidy up the producer connection
    # producer.close()
    time.sleep(0.5)
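
As an aside, the delivery_report callback above is never registered, and a new Producer is constructed on every iteration. A leaner variant, as a sketch assuming the same topic and broker address, creates the producer once and wires the callback in through produce():

    from confluent_kafka import Producer
    from faker import Faker

    def delivery_report(err, msg):
        # Called from poll()/flush() once per produced message.
        if err is not None:
            print('Message delivery failed: {}'.format(err))
        else:
            print('Message delivered to {} [{}]'.format(msg.topic(), msg.partition()))

    myFactory = Faker()
    myFactory.random.seed(5467)

    # Create the Producer once instead of once per loop iteration.
    producer = Producer({'bootstrap.servers': 'kafka:29092'})

    for _ in range(10):
        data = myFactory.name()
        # Register the callback so delivery results are actually reported.
        producer.produce('mytopic', data.encode('utf-8'), on_delivery=delivery_report)
        producer.poll(0)  # serve queued delivery callbacks

    producer.flush()  # block until every queued message is delivered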

Here are the 10 output messages:

    streamer_1   | producer.py:35: DeprecationWarning: PY_SSIZE_T_CLEAN will be required for '#' formats
    streamer_1   |   producer.produce('mytopic', data.encode('utf-8'))
    streamer_1   | data:  Denise Reed
    streamer_1   | data:  Megan Douglas
    streamer_1   | data:  Philip Obrien
    streamer_1   | data:  William Howell
    streamer_1   | data:  Michael Williamson
    streamer_1   | data:  Cheryl Jackson
    streamer_1   | data:  Janet Bruce
    streamer_1   | data:  Colton Martin
    streamer_1   | data:  David Melton
    streamer_1   | data:  Paula Ingram

When I try to consume the messages through the consumer, it only consumes the last message, in this case Paula Ingram, and then the program keeps running forever like an infinite loop. Not sure what is going wrong. Here is the consumer code:

    from kafka.consumer import KafkaConsumer

    try:
        print('Welcome to parse engine')
        # From inside a container
        #consumer = KafkaConsumer('test-topic', bootstrap_servers='kafka:29092')
        # From localhost
        consumer = KafkaConsumer('mytopic', bootstrap_servers='localhost:9092', auto_offset_reset='earliest')
        for message in consumer:
            print('im a message')
            print(message.value.decode("utf-8"))

    except Exception as e:
        print(e)
        # Logs the error appropriately.
        pass
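
Note that if this script runs inside the consumer container rather than on the host, localhost:9092 will not reach the broker; the commented-out line above hints at the in-container address. A sketch of that variant, assuming the same mytopic topic:

    from kafka import KafkaConsumer

    # Inside the compose network, use the internal listener, not localhost.
    consumer = KafkaConsumer('mytopic', bootstrap_servers='kafka:29092',
                             auto_offset_reset='earliest')

    for message in consumer:
        print(message.value.decode('utf-8'))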

Any help would be appreciated. Thanks.

I suspect you are running into a consumer group problem.

auto_offset_reset='earliest' only applies when the consumer group does not exist yet. If the group already exists, it resumes from the last committed position.
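
One way to make 'earliest' take effect again is to start with a group that has no committed offsets yet, for example by choosing a fresh group_id. A minimal sketch (the group name here is invented for illustration):

    from kafka import KafkaConsumer

    # A group id with no committed offsets, so auto_offset_reset applies.
    consumer = KafkaConsumer(
        'mytopic',
        bootstrap_servers='localhost:9092',
        group_id='parse-engine-fresh',  # hypothetical name; any unused group works
        auto_offset_reset='earliest',
    )

    for message in consumer:
        print(message.value.decode('utf-8'))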


If that is not the case, it is not clear in which order you are running the consumer and the producer, but I would start the consumer first and then run docker-compose restart streamer a few times.
