Collect tomcat logs from tomcat docker container to Filebeat docker container

I have a Tomcat docker container and a Filebeat docker container, and both are up and running.

My objective: I need to collect the tomcat logs from the running Tomcat container into the Filebeat container.

Issue: I have no idea how to get the log files out of the Tomcat container.

What I have tried so far: I tried to create a docker volume, add the tomcat logs to that volume, and access that volume from the filebeat container, but with no success.
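Roughly, the attempt looked like the following (simplified; the volume name tomcat_logs is only illustrative). The Tomcat side wrote its logs into a named volume, and the Filebeat side tried to mount a volume of the same name:

# docker-containers compose file (Tomcat side)
  tomcat:
    volumes:
      - tomcat_logs:/usr/local/tomcat/logs

# Logstash compose file (Filebeat side)
  filebeat:
    volumes:
      - tomcat_logs:/usr/local/tomcat/logs

volumes:
  tomcat_logs: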

Structure: I have written two docker-compose.yml files. One is under the project Logstash (the root directory of that project), with the following structure; here I want to bring up the Elasticsearch, Logstash, Filebeat and Kibana docker containers from one configuration file. The other is under docker-containers (the root directory of that project), with the following structure; here I want to bring up the Tomcat, Nginx and Postgres containers from one configuration file.

  • Logstash: contains 4 main sub directories (Filebeat, Logstash, Elasticsearch and Kibana), an ENV file and a docker-compose.yml file. Each sub directory contains a Dockerfile to pull the image and build the container.

  • docker-containers: contains 3 main sub directories (Tomcat, Nginx and Postgres), an ENV file and a docker-compose.yml file. Each sub directory contains a separate Dockerfile to pull the docker image and build the container.

  • Note: I think this basic structure may be helpful to understand my requirements (a rough layout sketch follows this list).
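For reference, the two project trees look roughly like this (only the files mentioned above and in the compose files below are shown):

Logstash/
├── docker-compose.yml
├── ENV file
├── elasticsearch/   (Dockerfile, config/elasticsearch.yml)
├── filebeat/        (Dockerfile, config/filebeat.yml)
├── logstash/        (Dockerfile, config/logstash.yml, pipeline/)
└── kibana/          (Dockerfile, config/)

docker-containers/
├── docker-compose.yml
├── ENV file
├── tomcat/          (Dockerfile, ${WARNAME}.war)
├── nginx/           (Dockerfile)
└── postgres/        (Dockerfile)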

docker-compose.yml files

Logstash.docker-compose.yml file

version: '2'
services:


  elasticsearch:
    container_name: OTP-Elasticsearch
    build:
      context: ./elasticsearch
      args:
        - ELK_VERSION=${ELK_VERSION}
    volumes:
      - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk

  filebeat:
    container_name: OTP-Filebeat
    command:
      - "-e"
      - "--strict.perms=false"
    user: root
    build:
      context: ./filebeat
      args:
        - ELK_VERSION=${ELK_VERSION}
    volumes:
      - ./filebeat/config/filebeat.yml:/usr/share/filebeat/filebeat.yml
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    depends_on: 
      - elasticsearch
      - logstash

  logstash:
    container_name: OTP-Logstash
    build:
      context: ./logstash
      args:
        - ELK_VERSION=${ELK_VERSION}
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
    expose:
      - 5044/tcp
    ports:
      - "9600:9600"
      - "5044:5044"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    links:
      - elasticsearch
    depends_on:
      - elasticsearch


  kibana:
    container_name: OTP-Kibana
    build:
      context: ./kibana
      args:
        - ELK_VERSION=${ELK_VERSION}
    volumes:
      - ./kibana/config/:/usr/share/kibana/config:ro
    ports:
      - "5601:5601"
    networks:
      - elk
    links:
      - elasticsearch
    depends_on: 
      - elasticsearch
      - logstash
      - filebeat

networks:
  elk:
    driver: bridge

docker-containers.docker-compose.yml file

version: '2'
services:

  # Nginx
  nginx:
    container_name: OTP-Nginx
    restart: always
    build: 
      context: ./nginx
      args:
        - comapanycode=${COMPANY_CODE}
        - dbtype=${DB_TYPE}
        - dbip=${DB_IP}
        - dbname=${DB_NAME}
        - dbuser=${DB_USER}
        - dbpassword=${DB_PASSWORD}
        - webdirectory=${WEB_DIRECTORY}
    ports:
      - "80:80"
    links:
      - db:db
    volumes:
      - ./log/nginx:/var/log/nginx
    depends_on:
      - db

  # Postgres
  db:
    container_name: OTP-Postgres
    restart: always
    ports:
      - "5430:5430"
    build: 
      context: ./postgres
      args:
        - food_db_version=${FOOD_DB_VERSION}
        - dbtype=${DB_TYPE} 
        - retail_db_version=${RETAIL_DB_VERSION}
        - dbname=${DB_NAME} 
        - dbuser=${DB_USER}
        - dbpassword=${DB_PASSWORD}
    volumes:
      - .data/db:/octopus_docker/postgresql/data

  # Tomcat
  tomcat:
    container_name: OTP-Tomcat
    restart: always
    build: 
      context: ./tomcat
      args:
        - dbuser=${DB_USER}
        - dbpassword=${DB_PASSWORD}
    links:
      - db:db
    volumes:
      - ./tomcat/${WARNAME}.war:/usr/local/tomcat/webapps/${WARNAME}.war
    ports:
      - "8080:8080"
    depends_on:
      - db
      - nginx 

Additional files:

filebeat.yml (configuration file inside Logstash/Filebeat/config/)

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /usr/local/tomcat/logs/*.log
output.logstash:
  hosts: ["logstash:5044"]

Additional Info:

  • The system I am using is Ubuntu 18.04.
  • My goal is to collect the tomcat logs from the running tomcat container, forward them to Logstash, filter them, forward them to Elasticsearch and finally to Kibana for visualization (a minimal pipeline sketch follows this list).
  • For now I can collect local machine (host) logs (/var/log/) and visualize them in Kibana.
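For context, a minimal Logstash pipeline matching this flow (Beats input on port 5044, output to Elasticsearch) would look roughly like the sketch below; the actual pipeline files live under ./logstash/pipeline, and the index name here is only illustrative:

input {
  beats {
    port => 5044
  }
}

filter {
  # grok/date filters for the Tomcat log format would go here
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "tomcat-%{+YYYY.MM.dd}"   # illustrative index name
  }
}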

My Problem:

  • I need to know the proper way to collect the tomcat logs from the tomcat container and forward them to the logstash container via the filebeat container.

Any discussion, answer or help towards understanding a way to do this is highly appreciated.

Thanks.

Create a shared volume among all the containers and set up your Tomcat to save its log files into that folder. If you can put all services into one docker-compose.yml, just set up the volume internally:

docker-compose.yml

version: '3'
services:
  one:
    ...
    volumes:
      - logs:/var/log/shared
  two:
    ...
    volumes:
      - logs:/var/log/shared
volumes:
  logs:

If you need several docker-compose.yml files, create the volume globally in advance with docker volume create logs and map it into both compose files:

version: '3'
services:
  one:
    ...
    volumes:
      - logs:/var/log/shared
  two:
    ...
    volumes:
      - logs:/var/log/shared
volumes:
  logs:
    external: true
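Applied to the setup in the question, the external-volume approach might look roughly like this (the volume name tomcat_logs is illustrative): create the volume once on the host with docker volume create tomcat_logs, let the Tomcat service write its logs into it, and mount it into the Filebeat container at the same path that filebeat.yml already points at:

# docker-containers/docker-compose.yml (Tomcat side)
  tomcat:
    volumes:
      - ./tomcat/${WARNAME}.war:/usr/local/tomcat/webapps/${WARNAME}.war
      - tomcat_logs:/usr/local/tomcat/logs          # Tomcat writes catalina.out etc. here

volumes:
  tomcat_logs:
    external: true

# Logstash/docker-compose.yml (Filebeat side)
  filebeat:
    volumes:
      - ./filebeat/config/filebeat.yml:/usr/share/filebeat/filebeat.yml
      - tomcat_logs:/usr/local/tomcat/logs:ro       # same path as filebeat.yml's paths setting

volumes:
  tomcat_logs:
    external: true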
