
Airflow not creating log files or showing logs in task instance on UI

I'm getting the following Airflow issue:

When I run DAGs that have multiple tasks in them, Airflow randomly sets some of the tasks to a failed state and also doesn't show any logs for them on the UI. I went into my running worker container and saw that the log files for those failed tasks had not been created either.

In Celery Flower, I found this log entry on the failed tasks:

airflow.exceptions.AirflowException: Celery command failed on host

How can I solve this?

My environment is:

  • Airflow 2.3.1
  • Docker Compose
  • Celery Executor
  • Worker, webserver, scheduler, and triggerer in separate containers; Docker Compose hosted on Ubuntu (sketched below)
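For context, this is roughly the layout I mean. It is a minimal sketch based on the official Airflow docker-compose example; the service names, image tag, connection strings, and volume paths are assumptions, not my actual file:

```yaml
# Sketch of a CeleryExecutor deployment with each Airflow component in its own container.
version: "3.8"

x-airflow-common: &airflow-common
  image: apache/airflow:2.3.1
  environment:
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs   # shared log directory so the webserver can also read worker task logs

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow

  redis:
    image: redis:latest

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - "8080:8080"

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler

  airflow-triggerer:
    <<: *airflow-common
    command: triggerer

  airflow-worker:
    <<: *airflow-common
    command: celery worker
```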

I also saw this answer, which might be related: https://stackoverflow.com/a/69201032/11949273

Is anyone else having these same issues?

Edit:

On my EC2 instance I added more vCPUs and fine-tuned the Airflow/Celery worker parameters, which solved this. It was probably an issue caused by a lack of CPU, or something else along those lines.
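For reference, the usual knobs for matching worker load to the available CPUs are the concurrency settings, which can be set as environment variables in the compose file. This is only a rough sketch; the values below are placeholders, not the exact ones I ended up with:

```yaml
x-airflow-common: &airflow-common
  environment:
    # Task slots per Celery worker (default 16); keep this close to the worker's vCPU count.
    AIRFLOW__CELERY__WORKER_CONCURRENCY: "4"
    # Task instances allowed to run at once across the whole installation (default 32).
    AIRFLOW__CORE__PARALLELISM: "8"
    # Task instances allowed to run at once per DAG (default 16).
    AIRFLOW__CORE__MAX_ACTIVE_TASKS_PER_DAG: "8"
```

The idea is that if the worker advertises far more task slots than the host has CPU, task processes can be starved or killed, which is one way to end up with failed tasks that never wrote a log file.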

I am facing a similar issue. In my case, Inspect -> Console showed an error with replaceAll in an old browser (Chrome 83.x); Chrome 98.x does not have this issue.

