
Filebeat is not sending the logs to Logstash

I am using Filebeat and the ELK stack, and the logs are not getting from Filebeat to Logstash. Can anyone help?

Filebeat version: 6.3.0, ELK version: 6.0.0

Filebeat config:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - '/var/lib/docker/containers/*/*.log'
  ignore_older: 0
  scan_frequency: 10s
  json.message_key: log
  json.keys_under_root: true
  json.add_error_key: true
  multiline.pattern: "^[[:space:]]+(at|\\.{3})\\b|^Caused by:"
  multiline.negate: false
  multiline.match: after
  registry_file: usr/share/filebeat/data/registry

output.logstash:
  hosts: ["172.31.34.173:5044"]
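
For reference, the Logstash side needs a beats input listening on the same port for this output to work. A minimal pipeline sketch is shown below; the Elasticsearch host in the output section is an assumption, not something from the question:

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]    # assumed Elasticsearch address; adjust to your setup
  }
}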

Filebeat logs:

2018-07-23T08:29:34.701Z        INFO    instance/beat.go:225    Setup Beat: filebeat; Version: 6.3.0
2018-07-23T08:29:34.701Z        INFO    pipeline/module.go:81   Beat name: ff01ed6d5ae4
2018-07-23T08:29:34.702Z        WARN    [cfgwarn]       beater/filebeat.go:61   DEPRECATED: prospectors are deprecated, Use `inputs` instead. Will be removed in version: 7.0.0
2018-07-23T08:29:34.702Z        INFO    [monitoring]    log/log.go:97   Starting metrics logging every 30s
2018-07-23T08:29:34.702Z        INFO    instance/beat.go:315    filebeat start running.
2018-07-23T08:29:34.702Z        INFO    registrar/registrar.go:75       No registry file found under: /usr/share/filebeat/data/registry. Creating a new registry file.
2018-07-23T08:29:34.704Z        INFO    registrar/registrar.go:112      Loading registrar data from /usr/share/filebeat/data/registry
2018-07-23T08:29:34.704Z        INFO    registrar/registrar.go:123      States Loaded from registrar: 0
2018-07-23T08:29:34.704Z        WARN    beater/filebeat.go:354  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2018-07-23T08:29:34.704Z        INFO    crawler/crawler.go:48   Loading Inputs: 1
2018-07-23T08:29:34.705Z        INFO    log/input.go:111        Configured paths: [/var/lib/docker/containers/*/*.log]
2018-07-23T08:29:34.705Z        INFO    input/input.go:87       Starting input of type: log; ID: 2696038032251986622
2018-07-23T08:29:34.705Z        INFO    crawler/crawler.go:82   Loading and starting Inputs completed. Enabled inputs: 1
2018-07-23T08:30:04.705Z        INFO    [monitoring]    log/log.go:124  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":20,"time":{"ms":22}},"total":{"ticks":50,"time":{"ms":60},"value":50},"user":{"ticks":30,"time":{"ms":38}}},"info":{"ephemeral_id":"5193ce7d-8d09-4e9d-ab4e-e55a5972b4
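
The log stops after loading the inputs and then only reports 30-second metrics, which suggests Filebeat never establishes a connection to Logstash. As a quick check, Filebeat's built-in test commands can verify the config and the output connection (assuming the config file is at /etc/filebeat/filebeat.yml; adjust the path for a container install):

filebeat test config -c /etc/filebeat/filebeat.yml
filebeat test output -c /etc/filebeat/filebeat.yml

If the output test fails, confirm that port 5044 is reachable on 172.31.34.173 and that Logstash has a beats input configured on that port.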

A bit late to reply, I know, but I was having the same issue, and after some searching I found this layout to work for me.

filebeat.prospectors:
- paths:
    - '<path to your log>'
  multiline.pattern: '<whatever pattern is needed>'
  multiline.negate: true
  multiline.match: after
  processors:
  - decode_json_fields:
      fields: ['<whatever field you need to decode>']
      target: json
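
Note that Filebeat 6.3 also warns that prospectors are deprecated (see the WARN line in the logs above), so the same layout can be written with filebeat.inputs. A minimal sketch combining that form with the Logstash output from the question would look like the following; the decoded field name 'log' is an assumption based on the Docker JSON log format, and the multiline settings are omitted for brevity:

filebeat.inputs:
- type: log
  paths:
    - '/var/lib/docker/containers/*/*.log'
  processors:
  - decode_json_fields:
      fields: ['log']    # assumed field name for Docker JSON log lines
      target: json

output.logstash:
  hosts: ["172.31.34.173:5044"]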

Here's a link to a similar problem.
