
Kafka Connector to map topic key as document id in ElasticSearch

I am trying to map the Kafka topic key as the document ID while indexing into Elasticsearch with the Kafka sink connector, but I get an exception saying that the key is null.

As shown in the example below, I want "AKA-25" to become the document ID, but the transform I am applying fails. If I take any other field from the value, as I tried with "empId", it seems to work fine. But my requirement is to map the key.

Can someone advise?

Here is the connector configuration:

{
  "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
  "type.name": "doc",
  "tasks.max": "1",
  "key.ignore": "false",
  "schema.ignore": "true",
  "key.converter.schemas.enable": "false",
  "value.converter.schemas.enable": "false",
  "connection.url": "http://elasticsearch:9200",
  "topics": "emp_change",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "transforms": "addSuffix,InsertKey,extractKey,dropPrefix",
  "transforms.extractKey.field": "key",
  "transforms.extractKey.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
  "transforms.InsertKey.fields": "key",
  "transforms.InsertKey.type": "org.apache.kafka.connect.transforms.ValueToKey"
  "transforms.dropPrefix.regex": "emp_change(.*)",
  "transforms.dropPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter", 
  "transforms.dropPrefix.replacement": "$1", 
  "transforms.addSuffix.replacement": "employee",
  "transforms.addSuffix.regex": ".*",
  "transforms.addSuffix.type": "org.apache.kafka.connect.transforms.RegexRouter",
  "transforms.createKey.fields": "key"
}

The Kafka topic message looks like this:

{
    "topic": "EmpChange",
    "key": "AKA-25",
    "value": {
      "eventType": "EMP_CHG",
      "empId": 1001,
      "empName": "AKASH"
    },
    "partition": 0,
    "offset": 0
  }
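
For reference, the key and value actually stored on the topic can be checked with kafkacat; the broker address below is an assumption, adjust it to your environment:

# Print each record's key and value from the emp_change topic
kafkacat -b localhost:9092 -t emp_change -C -f 'Key: %k\nValue: %s\n'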

Here is the exception I am getting:

org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:560)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:321)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: Key is used as document id and can not be null.
    at io.confluent.connect.elasticsearch.DataConverter.convertKey(DataConverter.java:79)
    at io.confluent.connect.elasticsearch.DataConverter.convertRecord(DataConverter.java:160)
    at io.confluent.connect.elasticsearch.ElasticsearchWriter.tryWriteRecord(ElasticsearchWriter.java:285)
    at io.confluent.connect.elasticsearch.ElasticsearchWriter.write(ElasticsearchWriter.java:270)
    at io.confluent.connect.elasticsearch.ElasticsearchSinkTask.put(ElasticsearchSinkTask.java:169)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:538)
    ... 10 more

I think you have an error in your configuration. As shown in your data, your message already has a key ( "key": "AKA-25" ):

{
    "topic": "EmpChange",
    "key": "AKA-25",
    "value": {
      "eventType": "EMP_CHG",
      "empId": 1001,
      "empName": "AKASH"
    },
    "partition": 0,
    "offset": 0
  }

However, you have a Single Message Transform that tells Kafka Connect to copy a field called key from the message value into the message key:

  "transforms.InsertKey.fields": "key",
  "transforms.InsertKey.type": "org.apache.kafka.connect.transforms.ValueToKey"

But since key is not a field in the message value, the transform sets the Kafka message key to NULL, which is why you get the error:

Caused by: org.apache.kafka.connect.errors.ConnectException: Key is used as document id and can not be null.
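
For comparison, a ValueToKey / ExtractField$Key chain only works for a field that actually exists inside the message value. A minimal sketch of such a chain using the empId field from your value (purely to illustrate the transform semantics; the transform names createKey/extractId are illustrative) would look like:

  "transforms": "createKey,extractId",
  "transforms.createKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
  "transforms.createKey.fields": "empId",
  "transforms.extractId.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
  "transforms.extractId.field": "empId"

With that chain the document ID would be 1001 rather than AKA-25, which is why the configuration below instead keeps the original record key and simply reads it with a StringConverter.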

The correct configuration to use would look something like this:

{
  "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
  "type.name": "doc",
  "tasks.max": "1",
  "key.ignore": "false",
  "schema.ignore": "true",
  "value.converter.schemas.enable": "false",
  "connection.url": "http://elasticsearch:9200",
  "topics": "PPAP_C01_ApexUserChange",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "transforms": "addSuffix,dropPrefix",
  "transforms.dropPrefix.regex": "PPAP_C01_ApexUserChange(.*)",
  "transforms.dropPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter", 
  "transforms.dropPrefix.replacement": "$1", 
  "transforms.addSuffix.replacement": "users1",
  "transforms.addSuffix.regex": ".*",
  "transforms.addSuffix.type": "org.apache.kafka.connect.transforms.RegexRouter"
}
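
To apply the updated configuration, you can PUT it to the Kafka Connect REST API; the connector name (es-sink-emp) and the Connect host/port used below are assumptions:

# Create or update the connector with the JSON config above saved as es-sink.json
curl -X PUT -H "Content-Type: application/json" \
     --data @es-sink.json \
     http://localhost:8083/connectors/es-sink-emp/config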

The differences between what you have and this updated configuration are:

  • Removed the ValueToKey (InsertKey) and ExtractField$Key (extractKey) transforms
  • Removed the orphaned createKey transform configuration
  • Changed the key converter to org.apache.kafka.connect.storage.StringConverter and removed the corresponding schemas.enable setting
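
Once the connector is running with this configuration, each Elasticsearch document's _id should be the Kafka message key. A quick way to check (index name employee as routed above, host as in the config) is:

# List documents in the employee index; each hit's _id should be the Kafka key, e.g. AKA-25
curl 'http://elasticsearch:9200/employee/_search?pretty'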
