Auto sinking of topics being created in Kafka to Elasticsearch
I have topics being created in Kafka (test1, test2, test3) and I want to sink them to Elasticsearch at creation time. I tried `topics.regex`, but it only creates indices for topics that already exist. How can I sink a new topic into an index when it gets created dynamically?
Here is the connector config that I am using for the Elasticsearch sink:
```json
{
  "name": "elastic-sink-test-regex",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics.regex": "test[0-9]+",
    "type.name": "kafka-connect",
    "connection.url": "http://192.168.0.188:9200",
    "key.ignore": "true",
    "schema.ignore": "true",
    "schema.enable": "false",
    "batch.size": "100",
    "flush.timeout.ms": "100000",
    "max.buffered.records": "10000",
    "max.retries": "10",
    "retry.backoff.ms": "1000",
    "max.in.flight.requests": "3",
    "is.timebased.indexed": "False",
    "time.index": "at"
  }
}
```
A sink connector won't read new topics until the connector is restarted (or a scheduled rebalance occurs). Instead, you can run a Kafka Streams application that reads messages from the new topics and puts them into a single result topic; the sink connector then reads from that result topic.
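A minimal sketch of such a Streams application. Kafka Streams can subscribe with a `Pattern`, and newly created topics matching the pattern are picked up automatically on the next metadata refresh, without restarting anything. The broker address `192.168.0.188:9092` and the result topic name `test-aggregate` are assumptions, not values from the question:

```java
import java.util.Properties;
import java.util.regex.Pattern;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class TopicAggregator {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topic-aggregator");
        // Assumed broker address -- adjust to your cluster.
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.0.188:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Subscribe by pattern: topics created later that match the regex
        // are discovered automatically, unlike a sink connector's topics.regex.
        KStream<String, String> input = builder.stream(Pattern.compile("test[0-9]+"));
        // Forward everything into one result topic for the sink to read.
        input.to("test-aggregate");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

This requires the `kafka-streams` dependency on the classpath and a reachable broker, so it is a sketch to adapt rather than something runnable as-is.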
To preserve the mapping between a message and its original topic, you can use Kafka record headers.
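If you prefer the plain consumer/producer clients over the Streams DSL, the original topic name can be attached as a record header while forwarding. The broker address, the `test-aggregate` topic, and the header key `source.topic` are all illustrative assumptions:

```java
import java.nio.charset.StandardCharsets;
import java.time.Duration;
import java.util.Properties;
import java.util.regex.Pattern;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ForwardWithHeader {
    public static void main(String[] args) {
        Properties cprops = new Properties();
        cprops.put("bootstrap.servers", "192.168.0.188:9092"); // assumed address
        cprops.put("group.id", "topic-forwarder");
        cprops.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        cprops.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        Properties pprops = new Properties();
        pprops.put("bootstrap.servers", "192.168.0.188:9092");
        pprops.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        pprops.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cprops);
             KafkaProducer<String, String> producer = new KafkaProducer<>(pprops)) {
            // Pattern subscription also discovers newly created matching topics.
            consumer.subscribe(Pattern.compile("test[0-9]+"));
            while (true) {
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofMillis(500))) {
                    ProducerRecord<String, String> out =
                        new ProducerRecord<>("test-aggregate", rec.key(), rec.value());
                    // Record which topic the message originally came from.
                    out.headers().add("source.topic",
                        rec.topic().getBytes(StandardCharsets.UTF_8));
                    producer.send(out);
                }
            }
        }
    }
}
```

Like the Streams version, this needs the `kafka-clients` dependency and a running broker.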
Make sure this approach meets your requirements!
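With that in place, the sink connector reads from the single aggregated topic instead of a regex. A sketch reusing the settings from the question (the topic name `test-aggregate` is an assumption):

```json
{
  "name": "elastic-sink-aggregate",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "test-aggregate",
    "type.name": "kafka-connect",
    "connection.url": "http://192.168.0.188:9200",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}
```

Note that by default the Elasticsearch sink names the index after the topic, so all messages land in one index; the original topic name then has to come from the record headers or the message payload.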