Kafka Consumer Not Receiving Messages When Subscribed to 3+ Topics

I'm using node-rdkafka ( https://github.com/Blizzard/node-rdkafka ) to consume and produce messages from an IBM Bluemix Message Hub. I'm able to consume and produce messages without issue, as long as I only subscribe to a max of 2 topics per node process. As soon as I subscribe to three or more topics, my consumer no longer receives any messages on any of the subscribed topics. I don't see any errors.

Is there a soft limit here? Or is there something in my code causing this issue? Bumping up server memory doesn't seem to have any effect.

Producer code:

events.send = events.produce = (topic, type, data) => {
  log.info('Sending message on topic ' + topic);

  let producer = lib.getProducer(hubConfig);

  // Connect to the broker manually
  producer.connect({}, (err) => {
    if (err) {
      log.error('Producer failed to connect');
      log.error(err);
    }
  });

  // Wait for the ready event before proceeding
  producer.on('ready', () => {
    log.info('Producer ready, sending message');
    try {
      producer.produce(
        topic,
        null,
        Buffer.from(JSON.stringify(data)), // Buffer.from() replaces the deprecated new Buffer()
        type,
        Date.now()
      );
    } catch (err) {
      log.error('A problem occurred when sending our message');
      log.error(err);
    }
  });

  producer.on('event.error', (err) => {
    log.error('Error from producer');
    log.error(err);
  });
};

lib.getProducer = (hubConfig) => {
  return new Kafka.Producer({
      'metadata.broker.list': hubConfig.kafka_brokers_sasl.join(','),
      'security.protocol': 'sasl_ssl',
      'ssl.ca.location': '/etc/ssl/certs',
      'sasl.mechanisms': 'PLAIN',
      'sasl.username': hubConfig.user,
      'sasl.password': hubConfig.password,
      'api.version.request': true,
      'dr_cb': true,
      'event_cb': true
    });
};

Consumer:

events.listen = events.consume = (topics, callback) => {
    if (!_.isArray(topics)) {
      topics = [topics];
    }
    log.info('Subscribing to ' + topics.join(', ') + ' on test event listener...');
    let consumer,
      emitter = new evt.EventEmitter(),

      // Each consumer has a unique group and client ID
      groupName = 'group-' + uuidv1(),
      clientName = 'client-' + uuidv1();

    consumer = lib.getConsumer(hubConfig, groupName, clientName);

    consumer.connect({}, (err) => {
      if (err) {
        log.error('Consumer failed to connect');
        log.error(err);
        if (callback) callback(err);
      }
    });
    consumer
      .on('ready', function() {
        log.info('Consumer connected, subscribed to ' + topics.join(', '));
        consumer.subscribe(topics);
        consumer.consume();
        if (callback) callback();
      })
      .on('data', function(data) {
        let d = data.value.toString().replace(/"/g,''),
          dupeKey = d + '-' + data.key;
        // NOTE: duplicateBuffer and DUPE_DELAY are defined elsewhere in this module (not shown)
        if (!duplicateBuffer[dupeKey]) {
          emitter.emit('message', {
            data: d,
            type: data.key,
            topic: data.topic
          });

          duplicateBuffer[dupeKey] = setTimeout(() => {
            delete duplicateBuffer[dupeKey];
          }, DUPE_DELAY);
        } else {
          log.info('Ignoring duplicate event: ' + d + ' ' + data.key);
        }
      })
      .on('event.error', (err) => {
        log.error(err);
        emitter.emit('error', err);
      });

    return emitter;
  };

lib.getConsumer = (hubConfig, groupName, clientName) => {
  return new Kafka.KafkaConsumer({
      'group.id': groupName,
      'client.id': clientName,
      'metadata.broker.list': hubConfig.kafka_brokers_sasl.join(','),
      'security.protocol': 'sasl_ssl',
      'ssl.ca.location': '/etc/ssl/certs',
      'sasl.mechanisms': 'PLAIN',
      'sasl.username': hubConfig.user,
      'sasl.password': hubConfig.password,
      'api.version.request': true,
      'event_cb': true
    }, {});
};

Any suggestions?

There is no soft limit on the number of topics you can subscribe to with node-rdkafka consumers.

Just to verify, I tweaked our node-rdkafka sample ( https://github.com/ibm-messaging/message-hub-samples/tree/master/kafka-nodejs-console-sample ) to use 3 topics and it worked as expected.
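
For instance, subscribing to three topics through your events.listen would look like this (the topic names are just placeholders):

let listener = events.listen(['orders', 'shipments', 'invoices']);

listener.on('message', (msg) => {
  console.log('Received on ' + msg.topic + ': ' + msg.type + ' ' + msg.data);
});
listener.on('error', (err) => {
  console.error(err);
});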

One thing I don't see in your producer code is a call to

  producer.setPollInterval(100);
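
Without periodic polling, delivery reports (you have 'dr_cb': true set) and other queued events never get serviced. As a sketch, reusing the lib.getProducer helper and log object from your code, the call could go right in the ready handler:

let producer = lib.getProducer(hubConfig);

producer.connect({}, (err) => {
  if (err) {
    log.error('Producer failed to connect');
    log.error(err);
  }
});

producer.on('ready', () => {
  // Poll librdkafka every 100ms so delivery reports and
  // other background events are serviced
  producer.setPollInterval(100);

  // ... produce() calls as in your code above ...
});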

Also, to help investigate, I'd suggest setting:

 'debug': 'all'

in both your clients' configurations.
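
For example, your getConsumer factory with debugging enabled might look like the sketch below; the 'event.log' handler is where node-rdkafka surfaces librdkafka's debug output:

lib.getConsumer = (hubConfig, groupName, clientName) => {
  let consumer = new Kafka.KafkaConsumer({
      'group.id': groupName,
      'client.id': clientName,
      'metadata.broker.list': hubConfig.kafka_brokers_sasl.join(','),
      'security.protocol': 'sasl_ssl',
      'ssl.ca.location': '/etc/ssl/certs',
      'sasl.mechanisms': 'PLAIN',
      'sasl.username': hubConfig.user,
      'sasl.password': hubConfig.password,
      'api.version.request': true,
      'event_cb': true,
      // Verbose librdkafka logging; you can narrow this to e.g.
      // 'consumer,cgrp,topic,fetch' once you know where to look
      'debug': 'all'
    }, {});

  // Debug output arrives via the 'event.log' event
  consumer.on('event.log', (msg) => {
    console.log(msg.severity, msg.fac, msg.message);
  });

  return consumer;
};

The same 'debug' property (and 'event.log' handler) applies to the producer configuration.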
