Create document id from two value fields separated by underscore using Elasticsearch Sink Connector for Kafka

I am trying to load records from a Kafka topic into Elasticsearch using the Elasticsearch Sink Connector, but I'm struggling to construct the document ids the way I would like them. I would like the document id written to Elasticsearch to be a composition of two values from my Kafka topic's message, separated by an underscore.

For example:

My Kafka topic value has the following Avro schema:

{
  "type": "record",
  "name": "SampleValue",
  "namespace": "com.abc.test",
  "fields": [
    {
      "name": "value1",
      "type": [
        "null",
        {
          "type": "int",
          "java-class": "java.lang.Integer"
        }
      ],
      "default": null
    },
    {
      "name": "value2",
      "type": [
        "null",
        {
          "type": "int",
          "java-class": "java.lang.Integer"
        }
      ],
      "default": null
    },
    {
      "name": "otherValue",
      "type": [
        "null",
        {
          "type": "int",
          "java-class": "java.lang.Integer"
        }
      ],
      "default": null
    }
  ]
}

I would like the document id that is written to Elasticsearch to be the combined values of value1 and value2, separated by an underscore. If the given value in Avro looked like

{"value1": {"int": 123}, "value2": {"int": 456}, "value3": {"int": 0}}

then I would like the document id for Elasticsearch to be 123_456.

I can't figure out the correct way to chain transformations in my connector config to create a key that is composed of two values separated by an underscore.

I don't think there is a Single Message Transform out of the box that will do what you want.
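
For reference, the closest you can get with stock transforms is chaining ValueToKey and ExtractField$Key, which can only promote whole fields into the key, not concatenate two of them. A sketch of such a config, where the connector name, topic, and transform aliases are placeholders:

{
  "name": "es-sink-sample",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "sample-topic",
    "key.ignore": "false",
    "transforms": "createKey,extractValue1",
    "transforms.createKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
    "transforms.createKey.fields": "value1,value2",
    "transforms.extractValue1.type": "org.apache.kafka.connect.transforms.ExtractField$Key",
    "transforms.extractValue1.field": "value1"
  }
}

With key.ignore=false the sink uses the record key as the document id, but ValueToKey produces a Struct holding both fields, and ExtractField can only pull one of them back out, so there is no way to end this chain with an underscore-joined key like 123_456.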

You can either write your own, using the Transform API, or you can use a stream processor such as Kafka Streams or ksqlDB.
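
A custom transform is not much code. Below is a minimal, untested sketch (the package and class name are made up for illustration) that reads value1 and value2 from the record value and replaces the record key with the underscore-joined string:

package com.abc.test.smt;  // hypothetical package

import java.util.Map;

import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.ConnectRecord;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.transforms.Transformation;

// Hypothetical SMT: replaces the record key with "<value1>_<value2>"
// taken from the record value. Error handling is omitted for brevity.
public class CombineFieldsAsKey<R extends ConnectRecord<R>> implements Transformation<R> {

    @Override
    public R apply(R record) {
        Struct value = (Struct) record.value();
        // Assumes both fields are non-null; add null checks for the optional Avro unions.
        String key = value.get("value1") + "_" + value.get("value2");
        return record.newRecord(
                record.topic(), record.kafkaPartition(),
                Schema.STRING_SCHEMA, key,            // new string key, e.g. "123_456"
                record.valueSchema(), record.value(),
                record.timestamp());
    }

    @Override
    public ConfigDef config() {
        return new ConfigDef(); // no configuration options in this sketch
    }

    @Override
    public void configure(Map<String, ?> configs) {
        // nothing to configure
    }

    @Override
    public void close() {
    }
}

Package it as a jar on the Connect worker's plugin path, register it with transforms=combineKey and transforms.combineKey.type=com.abc.test.smt.CombineFieldsAsKey, and keep key.ignore=false so the Elasticsearch sink uses the new key as the document id.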
