
Kafka-Elasticsearch Sink Connector not working

I am trying to send data from Kafka to Elasticsearch. I checked that my Kafka broker is working, because I can see that the messages I produce to a topic are read by a Kafka consumer. However, when I try to connect Kafka to Elasticsearch, I get the following error.

Command:

connect-standalone etc/schema-registry/connect-avro-standalone.properties \
etc/kafka-connect-elasticsearch/quickstart-elasticsearch.properties

Error:

ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectStandalone)
org.apache.kafka.connect.errors.ConnectException: Failed to connect to and describe Kafka cluster. Check worker's broker connection and security properties.
    at org.apache.kafka.connect.util.ConnectUtils.lookupKafkaClusterId(ConnectUtils.java:64)
    at org.apache.kafka.connect.util.ConnectUtils.lookupKafkaClusterId(ConnectUtils.java:45)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:83)
Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Timed out waiting for a node assignment.
    at org.apache.kafka.common.internals.KafkaFutureImpl.wrapAndThrow(KafkaFutureImpl.java:45)
    at org.apache.kafka.common.internals.KafkaFutureImpl.access$000(KafkaFutureImpl.java:32)
    at org.apache.kafka.common.internals.KafkaFutureImpl$SingleWaiter.await(KafkaFutureImpl.java:89)
    at org.apache.kafka.common.internals.KafkaFutureImpl.get(KafkaFutureImpl.java:260)
    at org.apache.kafka.connect.util.ConnectUtils.lookupKafkaClusterId(ConnectUtils.java:58)
    ... 2 more
Caused by: org.apache.kafka.common.errors.TimeoutException: Timed out waiting for a node assignment.

My Docker Compose file:

version: '3'
services:
  zookeeper:
    container_name : zookeeper
    image: zookeeper
    ports:
     - 2181:2181
     - 2888:2888
     - 3888:3888

  kafka:
    container_name : kafka
    image: bitnami/kafka:1.0.0-r5
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_BROKER_ID: "42"
      KAFKA_ADVERTISED_HOST_NAME: "kafka"
      ALLOW_PLAINTEXT_LISTENER: "yes" 
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092

  elasticsearch:
    container_name : elasticsearch
    image:
      docker.elastic.co/elasticsearch/elasticsearch:7.8.0
    environment:
      - node.name=elasticsearch
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=elasticsearch
      - bootstrap.memory_lock=false
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - discovery.type=single-node
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - data99:/usr/share/elasticsearch/data
    ports:
      - 9200:9200
  kibana:
    container_name : kibana
    image: docker.elastic.co/kibana/kibana:7.8.0
    # environment:
      # - SERVER_NAME=Local kibana
      # - SERVER_HOST=0.0.0.0
      # - ELASTICSEARCH_URL=elasticsearch:9400
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch

  kafka-connect:
    container_name : kafka-connect
    image: confluentinc/cp-kafka-connect:5.3.1
    ports:
      - 8083:8083
    depends_on:
      - zookeeper
      - kafka
    volumes:
      - $PWD/connect-plugins:/connect-plugins
    environment:
      CONNECT_BOOTSTRAP_SERVERS: kafka:9092
      CONNECT_REST_ADVERTISED_HOST_NAME: "localhost"
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: kafka-connect
      CONNECT_CONFIG_STORAGE_TOPIC: docker-kafka-connect-configs
      CONNECT_OFFSET_STORAGE_TOPIC: docker-kafka-connect-offsets
      CONNECT_STATUS_STORAGE_TOPIC: docker-kafka-connect-status
      CONNECT_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_INTERNAL_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_INTERNAL_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_KEY_CONVERTER-SCHEMAS_ENABLE: "false"
      CONNECT_VALUE_CONVERTER-SCHEMAS_ENABLE: "false"
      CONNECT_REST_ADVERTISED_HOST_NAME: "kafka-connect"
      CONNECT_LOG4J_ROOT_LOGLEVEL: "ERROR"
      CONNECT_LOG4J_LOGGERS: "org.apache.kafka.connect.runtime.rest=WARN,org.reflections=ERROR"
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: "1"
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: "1"
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: "1"
      CONNECT_TOPICS: "test-elasticsearch-sink"
      CONNECT_TYPE_NAME: "type.name=kafka-connect"
      CONNECT_PLUGIN_PATH: '/usr/share/java' #'/usr/share/java'
      # Interceptor config
      CONNECT_PRODUCER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringProducerInterceptor"
      CONNECT_CONSUMER_INTERCEPTOR_CLASSES: "io.confluent.monitoring.clients.interceptor.MonitoringConsumerInterceptor"
      CLASSPATH: /usr/share/java/monitoring-interceptors/monitoring-interceptors-5.3.1.jar
      CONNECT_KAFKA_HEAP_OPTS: "-Xms256m -Xmx512m"

volumes:
  data99:
    driver: local

I checked some other questions and answers but couldn't come up with a solution to this problem.

Thanks in advance!

The Connect container already starts a Connect Distributed server. You should configure the Elasticsearch connector over HTTP with a JSON payload (via the Connect REST API) rather than exec-ing into the container shell and running connect-standalone, which by default expects a broker running inside the Connect container itself.

Similarly, the Elastic quickstart properties file assumes, by default, that Elasticsearch is running inside the Connect container.
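A minimal sketch of that REST-API approach might look like the following. The connector name (`elasticsearch-sink`) is illustrative; the topic matches the `CONNECT_TOPICS` value from your compose file, and `connection.url` uses the `elasticsearch` service name so the Connect container can reach it over the compose network:

```shell
# Register an Elasticsearch sink connector through the Connect REST API
# (the kafka-connect service maps port 8083 to the host in the compose file).
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "elasticsearch-sink",
    "config": {
      "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
      "topics": "test-elasticsearch-sink",
      "connection.url": "http://elasticsearch:9200",
      "type.name": "kafka-connect",
      "key.ignore": "true",
      "schema.ignore": "true"
    }
  }'

# Then check that the connector and its task are RUNNING:
curl http://localhost:8083/connectors/elasticsearch-sink/status
```

`key.ignore` and `schema.ignore` are set here because the worker is configured with schemaless JSON converters; drop them if you switch to Avro with a schema registry.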
