Forwarding docker GELF logs to Logstash with Filebeat (or alternative?)

GELF messages are a subset of all JSON strings. How can I use Filebeat (or an alternative) as a lightweight solution to reliably forward docker GELF logs to Logstash?

Further info:

I have a cluster of machines (docker swarm for now) in the same network running docker containers. I want to use --log-driver=gelf because I like the GELF format and want the fields that docker adds to each GELF log entry.
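For context, starting a container with the GELF driver might look like this (a minimal sketch; the address udp://logstash-host:12201 is an assumed placeholder for wherever the GELF receiver runs):

    docker run \
      --log-driver gelf \
      --log-opt gelf-address=udp://logstash-host:12201 \
      --log-opt tag="{{.Name}}/{{.ID}}" \
      alpine echo "hello gelf"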

Unfortunately docker sends GELF logs over UDP and I fear losing log entries, either because packets are dropped, Logstash is down, or Logstash is under too much load. I don't want to run Logstash on each host because it is heavyweight.

Try placing RabbitMQ or Redis in front of GELF.

You'll want to split the filtering from the ingestion in a centralized manner, add several Logstash shippers, or simply have a way to buffer new logs against any kind of slower parsing. You can split the original log.conf into two files, depending on whether they read GELF into Redis or pull from the queue, parse, and send to ES; see the sketch below.
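A minimal sketch of such a split, assuming a Redis broker on redis-host and the standard GELF, Redis, and Elasticsearch plugins (the port, key name, and hosts are placeholder assumptions, not values from the question):

    # shipper.conf -- receives GELF from docker and buffers into Redis
    input {
      gelf {
        port => 12201
      }
    }
    output {
      redis {
        host => "redis-host"
        data_type => "list"
        key => "docker-logs"
      }
    }

    # indexer.conf -- pulls from Redis, parses, and sends to Elasticsearch
    input {
      redis {
        host => "redis-host"
        data_type => "list"
        key => "docker-logs"
      }
    }
    filter {
      # put the slower grok/mutate parsing here
    }
    output {
      elasticsearch {
        hosts => ["http://es-host:9200"]
      }
    }

The shipper stays lightweight because it does no filtering; the indexer can parse at its own pace while the Redis list absorbs bursts.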
