
Configuration for sending Docker logs to a self-hosted ELK stack using GELF

I have my ELK stack deployed on an EC2 instance and a Dockerized application running on a different instance. I am trying to use the GELF logging driver to collect the logs of the different services and send them to Logstash, but my current configuration doesn't work.

Here's my docker.yaml file and my Logstash conf file. For the GELF address I used the private IP of the instance where Logstash is running - is that what I should be using in this case? What am I missing?

version: '3'
services:
  app:
    build: .
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    links:
      - redis:redis
    depends_on:
      - redis
    logging:
      driver: gelf
      options:
        gelf-address: "udp://10.0.1.98:12201"
        tag: "dockerlogs"
  redis:
    image: "redis:alpine"
    expose:
      - "6379"
    logging:
      driver: gelf
      options:
        gelf-address: "udp://10.0.1.98:12201"
        tag: "redislogs"

This is my logstash conf:

input {
  beats {
    port => 5044
  }
  gelf {
    port:12201
    type=> "dockerLogs"
  }
}
output {
  elasticsearch {
    hosts => ["${ELK_IP}:9200"]
    index =>"logs-%{+YYYY.MM.dd}"
  }
}

Double-check your Docker version, and verify that the syntax of your configuration files is correct.
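
In particular, the `gelf` input block in the Logstash conf above uses `port:12201`, which is not valid Logstash syntax; plugin options are assigned with `=>`. A corrected input section might look like this (the `gelf` input listens on UDP by default, matching the `udp://` address in the compose file):

```
input {
  beats {
    port => 5044
  }
  gelf {
    port => 12201
    type => "dockerLogs"
  }
}
```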

Docker resolves the GELF address through the host's network - the logging driver runs in the Docker daemon, not inside the container - so the address needs to be one that is reachable from the host, typically the external address of the server.
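
One way to check reachability is to hand-craft a GELF datagram from the Docker host and watch whether it shows up in Elasticsearch. A minimal sketch (the target IP and port are the ones from the compose file; `send_gelf` is a hypothetical helper, not part of any library):

```python
import json
import socket
import zlib


def send_gelf(host, port, short_message, **extra):
    """Send a minimal GELF 1.1 message over UDP (zlib-compressed)."""
    msg = {
        "version": "1.1",
        "host": socket.gethostname(),
        "short_message": short_message,
    }
    # GELF additional fields must be prefixed with an underscore
    msg.update({f"_{k}": v for k, v in extra.items()})
    payload = zlib.compress(json.dumps(msg).encode("utf-8"))
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(payload, (host, port))
    finally:
        sock.close()


if __name__ == "__main__":
    # Run this from the Docker host, pointing at the Logstash instance
    send_gelf("10.0.1.98", 12201, "gelf connectivity test", tag="manual-test")
```

If this message never reaches Logstash, the problem is network-level (security group, routing, wrong IP) rather than the Docker logging configuration.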

Since you are only shipping application logs and not taking advantage of Logstash filters, why not write directly to Elasticsearch?

see also: Using docker-compose with GELF log driver
