
Collect Kubernetes pod logs

I'm trying to collect my application containers' logs throughout their entire life cycle. The containers run inside Kubernetes pods. I've found solutions like Fluentd, but they seem to require a backend (Elasticsearch, AWS S3, etc.), whereas I want to collect logs into files with specific names, for example podname_namespace_containername.json, and then parse those files with a script. Is this possible with Fluentd?
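The post-processing step the question describes could look something like the sketch below. The file name pattern (`podname_namespace_containername.json`) and the one-JSON-record-per-line layout are assumptions taken from the question itself, not from any collector's actual output:

```python
import json
import re
from pathlib import Path

# Hypothetical naming scheme from the question: podname_namespace_containername.json
NAME_RE = re.compile(r"^(?P<pod>[^_]+)_(?P<namespace>[^_]+)_(?P<container>[^_]+)\.json$")

def parse_log_file(path: Path):
    """Extract pod metadata from the file name and read one JSON record per line."""
    meta = NAME_RE.match(path.name)
    if meta is None:
        raise ValueError(f"unexpected file name: {path.name}")
    records = [json.loads(line) for line in path.read_text().splitlines() if line.strip()]
    return meta.groupdict(), records

# Self-contained demo with a sample file
sample = Path("myapp-7d4b_production_web.json")
sample.write_text('{"log": "request handled", "stream": "stdout"}\n')
meta, records = parse_log_file(sample)
print(meta["pod"], meta["namespace"], meta["container"], len(records))
```

With real collector output the record fields would differ; the point is only that the metadata the question wants can live in the file name and be recovered with a regex.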

By far the fastest way to set up log collection is https://github.com/helm/charts/tree/master/stable/fluent-bit . Refer to its values.yaml for all available options. It supports multiple backends such as Elasticsearch, S3, and Kafka. Every log event is enriched with pod metadata (pod name, namespace, etc.) and tagged, so you can organize processing separately on the backend, e.g. select and parse only certain pods in certain namespaces.
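A minimal values override for that chart might look like the sketch below. The key names (`backend.type`, `backend.es.*`) follow the chart's README as I recall it; verify them against the values.yaml of the chart version you actually install:

```yaml
# values-override.yaml — sketch for the stable/fluent-bit Helm chart
# (key names are assumptions; check the chart's values.yaml for your version)
backend:
  type: es
  es:
    host: elasticsearch.logging.svc
    port: 9200
```

Installed with something like `helm install --name fluent-bit -f values-override.yaml stable/fluent-bit` (Helm 2 syntax, matching the era of the stable charts repository).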

According to https://kubernetes.io/docs/concepts/cluster-administration/logging/ , applications log to stdout/stderr, the container runtime writes those streams to files on the underlying node, and a log collector running as a DaemonSet picks everything up and forwards it.


A Fluent Bit DaemonSet in Kubernetes implements exactly this architecture. More documentation on Fluent Bit: https://docs.fluentbit.io/manual/concepts/data-pipeline
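Note that this architecture already gets close to the naming the question asks for: on most clusters the kubelet/runtime exposes container logs under /var/log/containers/ with names of the form pod_namespace_container-id.log. A Fluent Bit pipeline that tails those files and enriches them with the kubernetes filter could be sketched like this (a sketch, not a production config; paths and parser names may need adjusting for your runtime):

```ini
[INPUT]
    Name    tail
    Path    /var/log/containers/*.log
    Tag     kube.*
    Parser  docker

[FILTER]
    Name    kubernetes
    Match   kube.*

[OUTPUT]
    Name    file
    Match   kube.*
    Path    /var/log/collected
```

Because the tag is derived from the source file path, the file output produces per-tag files whose names still carry the pod, namespace, and container, which a script can then pick apart much like the question intends.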
