I found a Sensu handler on GitHub for shipping data to Elasticsearch: https://github.com/m4ce/sensu-handlers-elasticsearch . I have configured this handler to send keepalive events to Elasticsearch. When a critical event is generated, the log appears in Elasticsearch. However, when the event is resolved, the critical log is overwritten and replaced with the resolved log. I need to track both the critical and resolved logs, so I can't have any logs being overwritten. Has anyone had the same problem, or does anyone know how to resolve it?
Thanks,
AM
The URI that the Sensu handler is POSTing to is:

<elasticsearch_url>/<index>/<type>/<document_id>

(you can see this on line 53 of the handler, where the URI is built)

Where:

- <elasticsearch_url> is the --url parameter.
- <index> is the --index parameter.
- <type> is the --type parameter.
- <document_id> is the id field of the Sensu event.

There are two things to note about why this is happening the way you described:
1. A Sensu event for a given client and check, regardless of the status of the event, will always have the same ID.
2. POSTing to the same URL overwrites the existing document (the <document_id> in the URL becomes the document's _id field in Elasticsearch).

It seems like you're looking for a unique Elasticsearch document per Sensu occurrence, which would just be a matter of modifying the Sensu handler to write to a URL that has a truly unique <document_id> instead of one unique to the Sensu event. Elasticsearch will handle generating unique document IDs automatically, if you let it. That means you should be able to resolve this fairly easily, by modifying line 53 of the handler from:
uri = URI("#{config[:url]}/#{index}/#{config[:type]}/#{@event['id']}")
to:
uri = URI("#{config[:url]}/#{index}/#{config[:type]}")
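To see why dropping the <document_id> segment fixes the overwrite, here is a minimal self-contained Ruby sketch. It mimics the two POST behaviors with an in-memory store; FakeIndex is a hypothetical stand-in for an Elasticsearch index, not part of the handler or the Elasticsearch client:

```ruby
require 'securerandom'

# In-memory stand-in for an Elasticsearch index (hypothetical, for
# illustration only). Shows the difference between POSTing to
# .../<type>/<document_id> and POSTing to .../<type> with no ID.
class FakeIndex
  def initialize
    @docs = {}
  end

  # POST /<index>/<type>/<id> — an explicit ID: a second POST with the
  # same ID replaces the existing document.
  def post_with_id(id, doc)
    @docs[id] = doc
    id
  end

  # POST /<index>/<type> — no ID: the store assigns a fresh unique _id,
  # so every occurrence becomes its own document.
  def post(doc)
    id = SecureRandom.uuid
    @docs[id] = doc
    id
  end

  def count
    @docs.size
  end
end

critical = { 'check' => 'keepalive', 'status' => 2, 'action' => 'create' }
resolved = { 'check' => 'keepalive', 'status' => 0, 'action' => 'resolve' }

# Old behavior: both events carry the same Sensu event ID, so the
# resolve overwrites the critical document.
index = FakeIndex.new
event_id = 'abcd-1234' # hypothetical Sensu event ID
index.post_with_id(event_id, critical)
index.post_with_id(event_id, resolved)
puts index.count # => 1: only the resolved log survives

# New behavior: let the store assign IDs; both logs are kept.
index2 = FakeIndex.new
index2.post(critical)
index2.post(resolved)
puts index2.count # => 2: critical and resolved logs both survive
```

The same logic applies to the real handler: once the URI no longer ends in @event['id'], Elasticsearch assigns a fresh _id per POST, so critical and resolved events land as separate documents.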