
How to configure a Kafka cluster to work with an Elasticsearch cluster?

I have to build a log cluster and a monitoring cluster (for high availability) with the topology shown below. I'd like to know how to configure these log-shipper clusters. (There are two topologies in the image.)

  1. If I use Kafka with Filebeat in the Kafka cluster, will Elasticsearch receive duplicate data because Kafka keeps replicas of the data?

  2. If I use Logstash (in the Elasticsearch cluster) to fetch logs from the Kafka cluster, how should it be configured? I'm concerned that Logstash won't know where to read from the Kafka cluster efficiently.

Cluster topology

Thanks for reading. If you have any ideas, please discuss them with me ^^!

As far as I can see, both configurations are compatible with Kafka: you can use Filebeat, Logstash, or a mix of them in the producer and consumer stages.

IMHO it all depends on your needs. For example, sometimes we use filters to enrich the data before ingesting it into Kafka (producer stage), or before indexing it into Elasticsearch (consumer stage). In that case it is better to work with Logstash, because applying filters is easier there than in Filebeat.

But if you just want to ship raw data, Filebeat may be the better choice, because the agent is lighter.
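As a minimal sketch of the Filebeat-as-producer option (broker addresses, log paths, and the topic name below are placeholders, not from the question):

```yaml
# filebeat.yml -- minimal sketch; hosts, paths, and topic are hypothetical
filebeat.inputs:
  - type: filestream
    id: app-logs                  # hypothetical input id
    paths:
      - /var/log/app/*.log        # hypothetical log path

output.kafka:
  # brokers of the Kafka cluster (placeholders)
  hosts: ["kafka1:9092", "kafka2:9092", "kafka3:9092"]
  topic: "app-logs"               # hypothetical topic name
  partition.round_robin:
    reachable_only: false         # spread events across all partitions
  required_acks: 1                # wait for the leader's ack
  compression: gzip
```

With this setup Filebeat only produces into Kafka; enrichment, if any, happens later in the consumer stage.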

About your questions:

  1. Kafka does replicate the data, but the replicas exist for HA purposes. As long as your consumers share the same consumer group, each message is read only once, so Elasticsearch will not receive duplicates.
  2. To read the logs from Kafka with Logstash, you can use the Logstash Kafka input plugin; it is easy to set up and works well!
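Both points can be illustrated in one pipeline sketch (broker and Elasticsearch addresses, topic, group id, and index pattern are placeholders, not from the question). Giving every Logstash instance the same `group_id` is what makes Kafka deliver each message to only one of them:

```
# logstash.conf -- minimal sketch; hosts, topic, and group_id are hypothetical
input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092,kafka3:9092"
    topics            => ["app-logs"]          # hypothetical topic
    group_id          => "logstash-indexers"   # same group on every Logstash
                                               # node => each message is
                                               # consumed exactly once by
                                               # the group
    consumer_threads  => 2
  }
}

output {
  elasticsearch {
    hosts => ["http://es1:9200", "http://es2:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"         # daily index (placeholder)
  }
}
```

Kafka itself tracks each group's offsets, so Logstash always knows where to resume reading; you can add more Logstash nodes with the same `group_id` and Kafka will rebalance the partitions among them.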

https://www.elastic.co/guide/en/logstash/current/plugins-inputs-kafka.html
