
Unable to use elasticsearch sink connector (kafka-connect)

I'm currently trying to start an elasticsearch sink connector on a kafka-connect cluster (distributed mode). This cluster is deployed in kubernetes using the helm charts provided by confluent, with some tweaks. Here are the relevant parts:

For values.yaml:

configurationOverrides:
  "plugin.path": "/usr/share/java,/usr/share/confluent-hub-components"
  "key.converter": "org.apache.kafka.connect.storage.StringConverter"
  "value.converter": "org.apache.kafka.connect.json.JsonConverter"
  "key.converter.schemas.enable": "false"
  "value.converter.schemas.enable": "false"
  "internal.key.converter": "org.apache.kafka.connect.json.JsonConverter"
  "internal.value.converter": "org.apache.kafka.connect.json.JsonConverter"
  "config.storage.replication.factor": "3"
  "offset.storage.replication.factor": "3"
  "status.storage.replication.factor": "3"
  "security.protocol": SASL_SSL
  "sasl.mechanism": SCRAM-SHA-256

And for the kube cluster part:

releases:
  - name: kafka-connect
    tillerless: true
    tillerNamespace: qa3-search
    chart: ../charts/cp-kafka-connect
    namespace: qa3-search
    values:
      - replicaCount: 2
      - configurationOverrides:
          config.storage.topic: kafkaconnectKApp_connect-config_private_json
          offset.storage.topic: kafkaconnectKApp_connect-offsets_private_json
          status.storage.topic: kafkaconnectKApp_connect-statuses_private_json
          connect.producer.client_id: "connect-worker-producerID"
          groupId: "kafka-connect-group-ID"
          log4j.root.loglevel: "INFO"
          bootstrap_servers: "SASL_SSL://SOME_ACCESSIBLE_URL:9094"
          client.security.protocol: SASL_SSL
          client.sasl.mechanism: SCRAM-SHA-256
      - prometheus:
          jmx:
            enabled: false
      - ingress:
          enabled: true
          hosts:
            - host: kafka-connect.qa3.k8s.XXX.lan
              paths:
                - /
      - cp-schema-registry:
          url: "https://SOME_ACCESSIBLE_URL"

Then I am loading the elasticsearch sink connector as such:

curl -X POST -H 'Content-Type: application/json' http://kafka-connect.qa3.k8s.XXX.lan/connectors -d '{
"name": "similarads3",
"config": {
"connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
"consumer.interceptor.classes": "io.confluent.monitoring.clients.interceptor.MonitoringConsumerInterceptor",
"topics": "SOME_TOPIC_THAT_EXIST",
"topic.index.map": "SOME_TOPIC_THAT_EXIST:test_similar3",
"connection.url": "http://vqa38:9200",
"batch.size": 1,
"type.name": "similads",
"key.ignore": true,
"errors.log.enable": true,
"errors.log.include.messages": true,
"value.converter": "io.confluent.connect.avro.AvroConverter",
"value.converter.schema.registry.url": "SOME_ACCESSIBLE_URL",
"schema.ignore": true
}
}' -vvv

Moreover, I'm loading the user and password for broker authentication via environment variables, and I'm pretty sure the connection uses the right ACLs...
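
For reference, a minimal sketch of how that injection can look in the chart values, assuming the chart exposes a customEnv map and relying on the Confluent image's CONNECT_-prefixed environment variable mapping (the credentials below are placeholders; in practice they would come from a Kubernetes secret):

customEnv:
  CONNECT_SASL_JAAS_CONFIG: "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"SOME_USER\" password=\"SOME_PASSWORD\";"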

What is troubling me is that there is no index creation when the connector starts, and there is no error whatsoever in kafka-connect's logs... It just says everything has started:

Starting connectors and tasks using config offset 68

When running a curl on /connectors/similarads3/status, everything is running, without errors.
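
That status check is just a GET against the Connect REST API; the response (trimmed for illustration) reported RUNNING for both the connector and its task:

curl http://kafka-connect.qa3.k8s.XXX.lan/connectors/similarads3/status

{"name":"similarads3","connector":{"state":"RUNNING"},"tasks":[{"id":0,"state":"RUNNING"}],"type":"sink"}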

So it seems like I overlooked something, but I can't figure out what is missing. When I check consumer lag on this particular topic, it seems like no messages were ever consumed.
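
For what it's worth, the lag check was along these lines (a sketch, assuming the sink's consumer group follows the default connect-<connector-name> naming and that client-sasl.properties carries the same SASL settings as the workers):

kafka-consumer-groups --bootstrap-server SOME_ACCESSIBLE_URL:9094 \
  --command-config client-sasl.properties \
  --describe --group connect-similarads3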

If there is not enough information, I'm able to provide more. Does someone have an idea?

EDIT: I should have mentioned that I tried to configure it with a topic that does not exist: again, no error in the logs. (I don't know how to interpret this.)

EDIT 2: This issue is solved. We found the problem, and it appears that I did overlook something: in order to read from a topic protected by ACLs, you have to provide the SASL configuration for both the connector and the sink's consumer. So just duplicating the configuration prefixed with consumer. fixed this problem. However, I'm still surprised that no logs point to this.
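
Concretely, the fix amounts to repeating the broker security settings with the consumer. prefix in the worker's configurationOverrides; a sketch (the jaas.config line is illustrative, the real credentials are injected via environment variables as described above):

configurationOverrides:
  "consumer.security.protocol": SASL_SSL
  "consumer.sasl.mechanism": SCRAM-SHA-256
  "consumer.sasl.jaas.config": "org.apache.kafka.common.security.scram.ScramLoginModule required username=\"SOME_USER\" password=\"SOME_PASSWORD\";"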

We had issues trying to use the topic.index.map property. Even if you get it working, there is a note in the docs that it is deprecated:

topic.index.map
This option is now deprecated. A future version may remove it completely. Please use single message transforms, such as RegexRouter, to map topic names to index names.

I'd try using RegexRouter to accomplish this instead:

"transforms": "renameTopicToIndex",
"transforms.renameTopicToIndex.type": "org.apache.kafka.connect.transforms.RegexRouter"
"transforms.renameTopicToIndex.regex": ".*"
"transforms.renameTopicToIndex.replacement": "test_similar3"
