
Kafka connector to map the topic key as the document id in Elasticsearch

I am trying to map the Kafka topic key as the document id while indexing to Elasticsearch using the Kafka sink connector, but I am getting an exception that the key is null.

As shown in the example below, I want the key, which is "AKA-25", to be the document id, but the transformations I am applying are failing. It works fine if I use a field from the value instead; for example, I tried "empId" and that works. But my requirement is to map the key.

Can someone please suggest?

Here is the Connector config:

{
  "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
  "type.name": "doc",
  "tasks.max": "1",
  "key.ignore": "false",
  "schema.ignore": "true",
  "key.converter.schemas.enable": "false",
  "value.converter.schemas.enable": "false",
  "connection.url": "http://elasticsearch:9200",
  "topics": "emp_change",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter": "org.apache.kafka.connect.json.JsonConverter",
  "transforms": "addSuffix,InsertKey,extractKey,dropPrefix",
  "transforms.extractKey.field": "key",
  "transforms.extractKey.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
  "transforms.InsertKey.fields": "key",
  "transforms.InsertKey.type": "org.apache.kafka.connect.transforms.ValueToKey"
  "transforms.dropPrefix.regex": "emp_change(.*)",
  "transforms.dropPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter", 
  "transforms.dropPrefix.replacement": "$1", 
  "transforms.addSuffix.replacement": "employee",
  "transforms.addSuffix.regex": ".*",
  "transforms.addSuffix.type": "org.apache.kafka.connect.transforms.RegexRouter",
  "transforms.createKey.fields": "key"
}

The Kafka topic message looks like this:

{
    "topic": "EmpChange",
    "key": "AKA-25",
    "value": {
      "eventType": "EMP_CHG",
      "empId": 1001,
      "empName": "AKASH"
    },
    "partition": 0,
    "offset": 0
  }

Here is the exception that I am getting:

org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:560)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:321)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: Key is used as document id and can not be null.
    at io.confluent.connect.elasticsearch.DataConverter.convertKey(DataConverter.java:79)
    at io.confluent.connect.elasticsearch.DataConverter.convertRecord(DataConverter.java:160)
    at io.confluent.connect.elasticsearch.ElasticsearchWriter.tryWriteRecord(ElasticsearchWriter.java:285)
    at io.confluent.connect.elasticsearch.ElasticsearchWriter.write(ElasticsearchWriter.java:270)
    at io.confluent.connect.elasticsearch.ElasticsearchSinkTask.put(ElasticsearchSinkTask.java:169)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:538)
    ... 10 more

I think you've got a mistake in your config. As your data shows, your message already has a key ("key": "AKA-25"):

{
    "topic": "EmpChange",
    "key": "AKA-25",
    "value": {
      "eventType": "EMP_CHG",
      "empId": 1001,
      "empName": "AKASH"
    },
    "partition": 0,
    "offset": 0
  }

However, you have a Single Message Transform (ValueToKey) which instructs Kafka Connect to copy a field named key from the value of the message into the key of the message:

  "transforms.InsertKey.fields": "key",
  "transforms.InsertKey.type": "org.apache.kafka.connect.transforms.ValueToKey"

But since key is not a field in the value of the message, the transform inserts a NULL into the Kafka message key, hence the error that you get:

Caused by: org.apache.kafka.connect.errors.ConnectException: Key is used as document id and can not be null.
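
To see why, here is a sketch of the record before and after the ValueToKey step (a hypothetical rendering, using the sample message above): the transform looks for a field named key inside the value, finds none, and so replaces the perfectly good existing key with null.

{
  "before": {
    "key": "AKA-25",
    "value": { "eventType": "EMP_CHG", "empId": 1001, "empName": "AKASH" }
  },
  "after": {
    "key": null,
    "value": { "eventType": "EMP_CHG", "empId": 1001, "empName": "AKASH" }
  }
}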

The correct config to use should be this:

{
  "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
  "type.name": "doc",
  "tasks.max": "1",
  "key.ignore": "false",
  "schema.ignore": "true",
  "value.converter.schemas.enable": "false",
  "connection.url": "http://elasticsearch:9200",
  "topics": "PPAP_C01_ApexUserChange",
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "key.converter": "org.apache.kafka.connect.storage.StringConverter",
  "transforms": "addSuffix,dropPrefix",
  "transforms.dropPrefix.regex": "PPAP_C01_ApexUserChange(.*)",
  "transforms.dropPrefix.type": "org.apache.kafka.connect.transforms.RegexRouter", 
  "transforms.dropPrefix.replacement": "$1", 
  "transforms.addSuffix.replacement": "users1",
  "transforms.addSuffix.regex": ".*",
  "transforms.addSuffix.type": "org.apache.kafka.connect.transforms.RegexRouter"
}
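
With this config, each message should be indexed using its Kafka key as the document id. As a rough check (a sketch, assuming an Elasticsearch version that still accepts the type name doc in the URL; the index name employee comes from the addSuffix route, and the response is abbreviated):

GET http://elasticsearch:9200/employee/doc/AKA-25

{
  "_index": "employee",
  "_type": "doc",
  "_id": "AKA-25",
  "found": true,
  "_source": {
    "eventType": "EMP_CHG",
    "empId": 1001,
    "empName": "AKASH"
  }
}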

The delta between what you had and this updated config is:

  • Remove the ValueToKey and ExtractField$Key transforms, since the key you want is already the message key (see the note after this list)
  • Remove the orphan createKey transform config
  • Change the key converter to org.apache.kafka.connect.storage.StringConverter and remove the corresponding schemas.enable setting, since the key is a plain string rather than JSON
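
For completeness: the ValueToKey/ExtractField$Key pair is the right tool only when the id lives inside the message value, which matches your observation that empId "works fine". In that case the transform section would look roughly like this (a sketch, keyed on the empId field from your sample value):

  "transforms": "InsertKey,extractKey,addSuffix,dropPrefix",
  "transforms.InsertKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
  "transforms.InsertKey.fields": "empId",
  "transforms.extractKey.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
  "transforms.extractKey.field": "empId"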
