
When using the Sensu Handler for Elasticsearch, the logs are overwritten

I have found a Sensu handler on GitHub for shipping data to Elasticsearch. It can be seen here: https://github.com/m4ce/sensu-handlers-elasticsearch . I have configured this handler to send the keepalive events to Elasticsearch. When a critical event is generated, the log appears in Elasticsearch. However, when this event is resolved, the critical log is overwritten and replaced with the resolved log. I need to track both the critical and resolved logs, so I can't have any logs being overwritten. Has anyone had the same problem, or does anyone know how to resolve it?

Thanks,

AM

The URI that the Sensu handler is POSTing to is:

<elasticsearch_url>/<index>/<type>/<document_id>

(You can see this on line 53 of the handler, where the URI is built.)

Where:

  • <elasticsearch_url> is the --url parameter.
  • <index> is the --index parameter.
  • <type> is the --type parameter.
  • <document_id> is the id field of the Sensu event.
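As a concrete illustration of how those pieces combine (all of the parameter values below are hypothetical stand-ins, not taken from the original post), the URI the handler builds looks like this:

```ruby
require 'uri'

# Hypothetical values for the handler's parameters:
config = { url: 'http://localhost:9200', type: 'sensu_event' }
index  = 'sensu-2020.01.01'
event  = { 'id' => '0f36b6ab-2c2b-4a4d-a63c-0e5c1c1cd774' } # Sensu event id field

# Mirrors the URI construction on line 53 of the handler
uri = URI("#{config[:url]}/#{index}/#{config[:type]}/#{event['id']}")
puts uri
# => http://localhost:9200/sensu-2020.01.01/sensu_event/0f36b6ab-2c2b-4a4d-a63c-0e5c1c1cd774
```

Because the last path segment is the Sensu event ID, every POST for the same event targets the same Elasticsearch document.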

There are two things to note about why this is happening the way you described:

  1. A Sensu event, regardless of the status of the event, will always have the same ID.
  2. In Elasticsearch, POSTing a document will overwrite an existing document with the same ID (the _id field in Elasticsearch).

It seems like you're looking for a unique Elasticsearch document per Sensu occurrence, which would just be a matter of modifying the Sensu handler to write to a URL with a truly unique <document_id> instead of one unique to the Sensu event. Elasticsearch will generate unique document IDs automatically if you let it. That means you should be able to resolve this fairly easily by modifying line 53 of the handler from:

uri = URI("#{config[:url]}/#{index}/#{config[:type]}/#{@event['id']}")

to:

uri = URI("#{config[:url]}/#{index}/#{config[:type]}")
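With the event ID dropped from the path, Elasticsearch assigns a fresh auto-generated _id to every POST, so a resolve event no longer replaces the critical event's document. A minimal sketch of what the modified request construction could look like (the config and event values are hypothetical, and the real handler builds its request body from the full Sensu event JSON):

```ruby
require 'uri'
require 'net/http'
require 'json'

config = { url: 'http://localhost:9200', type: 'sensu_event' } # hypothetical
index  = 'sensu-2020.01.01'                                    # hypothetical
event  = { 'id' => 'abc123', 'action' => 'resolve' }           # stand-in event

# No <document_id> segment: Elasticsearch generates a unique _id per POST,
# so earlier documents for the same Sensu event are preserved.
uri = URI("#{config[:url]}/#{index}/#{config[:type]}")

request = Net::HTTP::Post.new(uri)
request['Content-Type'] = 'application/json'
request.body = event.to_json

# To actually send it:
# Net::HTTP.start(uri.host, uri.port) { |http| http.request(request) }
```

If you still need to correlate occurrences back to the originating Sensu event, the event's id remains available inside the document body; it just no longer serves as the Elasticsearch _id.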
