
Kafka Connect Elasticsearch connector (sink): moving data in (almost) real time

I'm interested in moving data from Kafka to Elasticsearch, so I have set up the Kafka Connect Elasticsearch connector.

Although I have reviewed all the documentation and configuration options, I have not yet found how to configure the connector to move data from Kafka to Elasticsearch in (near) real time.

That is, I want the connector to consume messages from the (Kafka) topic every second and write them into Elasticsearch, mimicking streaming of messages from Kafka to Elasticsearch.

This is the current configuration for the connector:

{
  "name": "elasticsearch-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "foo",
    "key.ignore": "true",
    "schema.ignore": "true",
    "connection.url": "http://elasticsearch:9200",
    "type.name": "kafka-connect",
    "name": "elasticsearch-sink"
  }
}
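As a side note, a config like the one above is typically registered with the Connect worker's REST API. A minimal sketch, assuming the worker is reachable at `localhost:8083` (the filename `elasticsearch-sink.json` is just an example):

```shell
# Save the connector config shown above to a file
cat > elasticsearch-sink.json <<'EOF'
{
  "name": "elasticsearch-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "foo",
    "key.ignore": "true",
    "schema.ignore": "true",
    "connection.url": "http://elasticsearch:9200",
    "type.name": "kafka-connect",
    "name": "elasticsearch-sink"
  }
}
EOF

# Sanity-check that the file is valid JSON before submitting it
python3 -m json.tool elasticsearch-sink.json > /dev/null && echo "config OK"

# Register the connector with the Connect worker's REST API
# (assumes the worker listens on localhost:8083; uncomment to run):
# curl -X POST -H "Content-Type: application/json" \
#      --data @elasticsearch-sink.json http://localhost:8083/connectors
```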

How does the connector need to be configured so that it will "stream" messages from Kafka to Elasticsearch?

Since a Kafka sink connector is based on a consumer, you can control polling behaviour via consumer properties (e.g. `max.poll.interval.ms`, `max.poll.records`). To configure these, just add the consumer property with the prefix `consumer.` to your connector config:

consumer.max.poll.records=1
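Applied to the config from the question, the override would look like this (everything else unchanged). Note this is a sketch: depending on your Kafka version, the `consumer.` prefix may need to go in the worker properties file instead, while per-connector overrides use the `consumer.override.` prefix (introduced in Kafka 2.3 via KIP-458) and require the worker setting `connector.client.config.override.policy=All`:

```json
{
  "name": "elasticsearch-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "foo",
    "key.ignore": "true",
    "schema.ignore": "true",
    "connection.url": "http://elasticsearch:9200",
    "type.name": "kafka-connect",
    "name": "elasticsearch-sink",
    "consumer.override.max.poll.records": "1"
  }
}
```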

