
Using Logstash to pass Airflow logs to Elasticsearch

When using Logstash to retrieve Airflow logs from a folder it has access to, would I still need to make any changes in the airflow.cfg file?

For instance, I have Airflow and ELK deployed on the same EC2 instance. The logstash.conf file has access to the Airflow logs path since they are on the same instance. Do I need to turn on remote logging in the Airflow config?

In fact, you have two options to push Airflow logs to Elasticsearch:

  1. Use a log collector (Logstash, Fluentd, ...) to collect the Airflow logs and send them to the Elasticsearch server. In this case you don't need to change any Airflow config; you can simply read the logs from the files or stdout and ship them to ES.
  2. Use Airflow's remote logging feature. In this case Airflow logs directly to your remote logging server (ES in your case), and keeps a local copy of the logs to show when the remote server is unavailable.
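For the first option, a minimal Logstash pipeline could look like the sketch below. The log path, sincedb location, index name, and Elasticsearch host are assumptions for illustration; adjust them to your setup (the path should point at Airflow's `base_log_folder`):

```conf
# logstash.conf sketch -- paths and host are assumptions, not your actual values
input {
  file {
    # Recursively watch Airflow task log files on the same instance
    path => "/opt/airflow/logs/**/*.log"
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb_airflow"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "airflow-logs-%{+YYYY.MM.dd}"
  }
}
```

With this approach, Logstash tracks its read position via the sincedb file, so restarting it does not re-ingest old logs.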

So the answer to your question is no: if you have Logstash, you don't need Airflow's remote logging config.
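For contrast, the second option is enabled in airflow.cfg. A minimal sketch for Airflow 2.x follows; the host value is an assumption for a local ES on the default port:

```ini
[logging]
# Enable Airflow's remote logging feature (option 2)
remote_logging = True

[elasticsearch]
# Elasticsearch host (assumption: local instance on the default port)
host = localhost:9200
# Emit task logs as JSON on stdout so they can be ingested into ES
write_stdout = True
json_format = True
```

Note that with the Elasticsearch handler, Airflow reads task logs back from ES for display in the UI, so the logs still need to be shipped into ES by some collector; this is another reason the Logstash-only setup in your case is sufficient.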
