How to aggregate in Kibana information from multiple Elasticsearch indexes?

We are setting up logs from several related applications so the log events are imported into Elasticsearch (via Logstash). It was straightforward to create Kibana dashboards to visualize the log indexes for each application, but since the applications are related and their activities belong to the same pipeline, it would be great to build a dashboard that shows aggregated information collected from the different applications. Such a dashboard would be especially useful for tracking failures and performance problems.

Right now I can see three main ways to implement an aggregated dashboard:

  1. Keep separate application logs and configure a Kibana dashboard that consumes information from the different applications. I am afraid this can be a challenging task; I am not even sure Kibana fully supports it.
  2. Revise the applications' logging so they all log to the same index. What I dislike about this is that the log event structure must then be unified across applications, and the applications are built by different people in different languages. I have lost my faith in centralized control over such low-level details as logging.
  3. Keep the application logs and corresponding Elasticsearch indexes as they are now, but set up a new index which will contain the aggregate information. This article describes how to configure Elasticsearch to dump its logs to Logstash, which would then insert them back into Elasticsearch for searching. At first glance this approach may look surprising: why would you need to re-insert log data into the same database? It is another index, it adds overhead, uses more space, etc. But it gives the opportunity to set up the index in a way that is suitable for an aggregated Kibana dashboard (see the sketch after this list).
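
To make option 3 concrete: the aggregate index would hold events already shaped for the cross-application dashboard. Below is a minimal sketch of what one such document might look like; all field names (application, pipeline_stage, event_type, duration_ms, correlation_id) and values are hypothetical choices for illustration, not something prescribed by Elasticsearch or Kibana.

  {
    "@timestamp": "2015-06-01T12:34:56Z",
    "application": "billing-service",
    "pipeline_stage": "enrichment",
    "event_type": "failure",
    "duration_ms": 1540,
    "correlation_id": "7f3a2c1e-0000-4000-8000-000000000000"
  }

With a shape like this, the aggregated dashboard can filter and facet on fields such as application and event_type without caring how each application structures its own native log index.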

I wonder if someone has gone through a similar dilemma and can share their experience.

I believe you can just set the Default Index to _all if you're not planning to use timestamped indices.

Using the menus, go to Configure, click the Index tab, and set Timestamping to "none" and Default Index to _all. The JSON schema would end up containing something like this:

  "index": {
    "interval": "none",
    "pattern": "[logstash-]YYYY.MM.DD",
    "default": "_all",
    "warm_fields": false
  },

If you need timestamped indices, you would need to choose the appropriate interval and enter a comma-separated list of the indices, each specified in the proper format.
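
For illustration, here is a sketch of how that index section might then look with daily indices from two applications. This assumes the comma-separated pattern form described above is accepted; the app1-logstash-/app2-logstash- prefixes are placeholders for your actual index names, and _all is kept as the fallback default as in the earlier snippet.

  "index": {
    "interval": "day",
    "pattern": "[app1-logstash-]YYYY.MM.DD,[app2-logstash-]YYYY.MM.DD",
    "default": "_all",
    "warm_fields": false
  },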

We faced the same problem, but from a different perspective.

I needed to get data from two indices in Kibana. Our data structure is the same in both indices.

So, I added the second index manually (section Settings -> Objects):

{
  "index": [
    "index_one",
    "index_two"
  ],
 ...
}

It helped me get the data I need from those indices.
