

Kafka-Connect vs Filebeat & Logstash

I'm looking to consume from Kafka and save data into Hadoop and Elasticsearch. I've currently seen two ways of doing this: using Filebeat to consume from Kafka and send it to ES, and using the Kafka-Connect framework. There are Kafka-Connect-HDFS and Kafka-Connect-Elasticsearch modules.
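For reference, a minimal sketch of what the Kafka-Connect-Elasticsearch route looks like: the connector is registered with a JSON config like the one below. The connector name, topic name, and Elasticsearch URL here are placeholders, and the exact options available depend on your connector version.

```json
{
  "name": "es-sink-example",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "tasks.max": "1",
    "topics": "my-topic",
    "connection.url": "http://localhost:9200",
    "key.ignore": "true",
    "schema.ignore": "true"
  }
}
```

With `schema.ignore` set, records are indexed as plain JSON without requiring a Connect schema or Schema Registry.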

I'm not sure which one to use to send streaming data. Though I think that if, at some point, I want to take data from Kafka and place it into Cassandra, I can use a Kafka-Connect module for that, but no such feature exists for Filebeat.
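For comparison, the Filebeat route is a small YAML config using Filebeat's `kafka` input (available in newer Filebeat versions). The broker address, topic name, and consumer group below are placeholders:

```yaml
# Consume messages from Kafka and ship them to Elasticsearch
filebeat.inputs:
  - type: kafka
    hosts:
      - localhost:9092
    topics: ["my-topic"]
    group_id: "filebeat"

output.elasticsearch:
  hosts: ["localhost:9200"]
```

This is convenient when Elasticsearch is the only destination, but Filebeat's outputs are limited to the Elastic Stack and a few others, which is the limitation mentioned above.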

Kafka Connect can handle streaming data and is a bit more flexible. If you are only going to Elasticsearch, Filebeat is a clean integration for log sources. However, if you are going from Kafka to a number of different sinks, Kafka Connect is probably what you want. I'd recommend checking out the connector hub to see some examples of the open source connectors currently at your disposal: http://www.confluent.io/product/connectors/
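To illustrate the "many sinks, one framework" point: adding a second destination such as HDFS is just another connector config posted to the same Connect cluster, rather than a new tool. A sketch, with placeholder topic and HDFS URL:

```json
{
  "name": "hdfs-sink-example",
  "config": {
    "connector.class": "io.confluent.connect.hdfs.HdfsSinkConnector",
    "tasks.max": "1",
    "topics": "my-topic",
    "hdfs.url": "hdfs://namenode:8020",
    "flush.size": "1000"
  }
}
```

Both this and an Elasticsearch sink can run side by side in one Connect cluster, each consuming the same topic independently.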

