
How to ship logs from pods on Kubernetes running on top of GCP to elasticsearch/logstash?

I run new modules of my system in Google Container Engine. I would like to ship stdout and stderr from them (running in pods) to my centralised logstash. Is there an easy way to forward logs from pods to an external logging service, e.g. logstash or elasticsearch?

I decided to log directly to elasticsearch, an external virtual machine that can be accessed at elasticsearch.c.my-project.internal (I am on Google-Cloud-Platform). It is quite easy:

  1. Set up a Service of type ExternalName, named elasticsearch-logging, that points to the elasticsearch instance:

    apiVersion: v1
    kind: Service
    metadata:
      name: elasticsearch-logging
      namespace: kube-system
      labels:
        k8s-app: elasticsearch
        kubernetes.io/name: "elasticsearch"
    spec:
      type: ExternalName
      externalName: elasticsearch.c.my-project.internal
      ports:
      - port: 9200
        targetPort: 9200

  2. Deploy fluentd-elasticsearch as a DaemonSet. fluentd-elasticsearch will automatically connect to the service named elasticsearch-logging (based on the fluentd-elasticsearch deployment definition):

    apiVersion: extensions/v1beta1
    kind: DaemonSet
    metadata:
      name: fluentd-elasticsearch
      namespace: kube-system
      labels:
        tier: monitoring
        app: fluentd-logging
        k8s-app: fluentd-logging
    spec:
      template:
        metadata:
          labels:
            name: fluentd-elasticsearch
        spec:
          containers:
          - name: fluentd-elasticsearch
            image: gcr.io/google_containers/fluentd-elasticsearch:1.19
            volumeMounts:
            - name: varlog
              mountPath: /var/log
            - name: varlibdockercontainers
              mountPath: /var/lib/docker/containers
              readOnly: true
          terminationGracePeriodSeconds: 30
          volumes:
          - name: varlog
            hostPath:
              path: /var/log
          - name: varlibdockercontainers
            hostPath:
              path: /var/lib/docker/containers

    Use kubectl logs fluentd-elasticsearch-... to check whether you were able to connect to the elasticsearch instance.

  3. Now, you can access kibana and see the logs.
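The reason the DaemonSet needs no extra configuration is that the image's fluentd pipeline targets the elasticsearch-logging Service name, which the ExternalName Service above resolves to the external VM. A minimal sketch of such an output section (an illustration, not the image's exact config) looks like:

```
<match **>
  @type elasticsearch
  host elasticsearch-logging
  port 9200
  logstash_format true
</match>
```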

You can create a sink from the logs in Stackdriver to Pub/Sub and then use the logstash-input-google_pubsub plugin, which exports all the logs to elastic using the logstash-input-google_pubsub image; see the source code.

Export logs to Pub/Sub

  1. create a topic and a subscription in pubsub, following the instructions here

  2. in the log viewer page click on create export, make sure you are filtered to your app's logs (GKE Container -> cluster-name, app-name), enter a sink name, choose Cloud Pubsub as Sink Service, then choose your topic in Sink Destination.

Logs from now on are exported to Pub/Sub.
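If you prefer the CLI over the console, the topic, subscription, and sink from the two steps above can also be created with gcloud (a sketch; the resource names and log filter are placeholders to adjust for your project):

```shell
# create the Pub/Sub topic and a pull subscription for logstash
gcloud pubsub topics create elastic-pubsub-test
gcloud pubsub subscriptions create elastic-pubsub-test --topic elastic-pubsub-test

# create a logging sink that routes matching log entries to the topic
gcloud logging sinks create elastic-sink \
    pubsub.googleapis.com/projects/my-gcloud-project-id/topics/elastic-pubsub-test \
    --log-filter='resource.type="container"'
```

Note that creating the sink prints a writer identity (a service account) that must be granted publish rights on the topic before entries start flowing.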

Configure the logstash pipeline

here is the pubsub-elastic.conf file:

input {
    google_pubsub {
        project_id => "my-gcloud-project-id"
        topic => "elastic-pubsub-test"
        subscription => "elastic-pubsub-test"
        json_key_file => "/etc/logstash/gcloud-service-account-key.json"
    }
}


output {
    elasticsearch {
        hosts => "https://example.us-east-1.aws.found.io:9243"
        user => "elastic"
        password => "mypassword"
    }
}

here is my Dockerfile:

FROM sphereio/logstash-input-google_pubsub


# Logstash config
COPY gcloud-service-account-key.json /etc/logstash/gcloud-service-account-key.json
COPY config /etc/logstash/conf.d
COPY logstash.yml /etc/logstash/logstash.yml
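The logstash.yml copied above is not shown in the answer; a minimal version that points Logstash at the pipeline directory the Dockerfile uses might look like this (an assumption, not the author's actual file):

```yaml
# minimal Logstash settings; path.config matches the COPY target above
http.host: "0.0.0.0"
path.config: /etc/logstash/conf.d
```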

Now build the image and run it.

if running on kubernetes use the following:

here is deployment.yaml:

apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: logstash-input-google-pubsub
spec:
  replicas: 1
  strategy:
    type: RollingUpdate
  template:
    metadata:
      labels:
        app: logstash-input-google-pubsub
    spec:
      containers:
      - name: logstash-input-google-pubsub
        image: us.gcr.io/my-gcloud-project-id/logstash-input-google_pubsub:1.0.0

Build your image and push it to the registry:

docker build --rm -t us.gcr.io/my-gcloud-project-id/logstash-input-google_pubsub:1.0.0 . 
gcloud docker -- push us.gcr.io/my-gcloud-project-id/logstash-input-google_pubsub:1.0.0

Now create the deployment: kubectl create -f deployment.yaml

Done!

Since Elasticsearch 6.0 you can use Filebeat.

See the blog post.

Download the Filebeat DaemonSet manifest:

curl -L -O https://raw.githubusercontent.com/elastic/beats/6.0/deploy/kubernetes/filebeat-kubernetes.yaml

Update the Elasticsearch connection details:

- name: ELASTICSEARCH_HOST
  value: elasticsearch
- name: ELASTICSEARCH_PORT
  value: "9200"
- name: ELASTICSEARCH_USERNAME
  value: elastic
- name: ELASTICSEARCH_PASSWORD
  value: changeme

Deploy it to Kubernetes:

kubectl create -f filebeat-kubernetes.yaml
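To check that the DaemonSet is running and shipping, you can inspect the Filebeat pods (a sketch; the k8s-app=filebeat label is assumed from the stock manifest, adjust if yours differs):

```shell
# list the Filebeat pods created by the DaemonSet
kubectl get pods -n kube-system -l k8s-app=filebeat

# tail one pod's logs to confirm it connected to Elasticsearch
kubectl logs -n kube-system -l k8s-app=filebeat --tail=20
```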

You could try installing the following kubernetes addon: https://github.com/kubernetes/kubernetes/tree/master/cluster/addons/fluentd-elasticsearch

I haven't tried it myself, but I'm also looking for proper logging. GCE logging is somewhat limited, in my opinion.
