Run airflow commands on official Airflow docker-compose
Fail to run DAG on Airflow 1.10.14 running with docker-compose on official Apache Airflow image
I have been trying to set up Airflow 1.10.14 to execute Python-based processes in Docker containers using docker-compose. The host is an Ubuntu 18 VM.
My Dockerfile:
FROM apache/airflow:1.10.14
USER root
RUN pip install --upgrade pip
RUN pip install --user psycopg2-binary
COPY airflow.cfg /opt/airflow/
RUN apt-get update && apt-get install -y \
    libodbc1 \
    python3-dev \
    libevent-dev \
    unixodbc-dev \
    freetds-dev \
    freetds-bin \
    tdsodbc \
    build-essential
# install dependencies
ADD requirements.txt .
RUN pip install -r requirements.txt
USER airflow
Then I run:
docker build -t learning/airflow .
And my docker-compose.yml is:
version: "3"

networks:
  airflow:

services:
  postgres:
    image: "postgres:9.6"
    container_name: "postgres"
    environment:
      - POSTGRES_USER=airflow
      - POSTGRES_PASSWORD=airflow
      - POSTGRES_DB=airflow
    ports:
      - "5432:5432"
    networks:
      - airflow

  # uncomment initdb if you need initdb at first run
  initdb:
    image: learning/airflow
    entrypoint: airflow db init
    depends_on:
      - postgres
    networks:
      - airflow

  webserver:
    image: learning/airflow
    restart: always
    depends_on:
      - postgres
    volumes:
      - ./dags:/opt/airflow/dags
    ports:
      - "8080:8080"
    entrypoint: airflow webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /opt/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
    networks:
      - airflow

  scheduler:
    image: learning/airflow
    restart: always
    depends_on:
      - postgres
      - webserver
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs
    entrypoint: airflow scheduler
    healthcheck:
      test: ["CMD-SHELL", "[ -f /opt/airflow/airflow-scheduler.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
    networks:
      - airflow
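One detail worth noting in the compose file above: only the scheduler mounts `./logs`, while the webserver has no logs volume at all, so logs written by one container are invisible to the other. A sketch of the webserver service with the same logs mount added (this is an assumption about the intended setup, following the conventions of the file above):

```yaml
  webserver:
    image: learning/airflow
    restart: always
    depends_on:
      - postgres
    volumes:
      - ./dags:/opt/airflow/dags
      - ./logs:/opt/airflow/logs   # same host directory the scheduler writes to
    ports:
      - "8080:8080"
    entrypoint: airflow webserver
    networks:
      - airflow
```

With a shared logs directory, the webserver can read task logs straight from disk instead of trying to fetch them over the network.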
I also use the airflow.cfg shown here (with minor changes). On the first run I execute three steps, each in a separate terminal:
docker-compose up postgres
docker-compose up initdb
docker-compose up webserver scheduler
I am able to reach the Airflow UI and trigger the DAG, but the first task fails immediately with the following error:
*** Log file does not exist: /opt/airflow/logs/stg_process/Process_g/2020-12-23T00:00:00+00:00/2.log
*** Fetching from: http://bf23abdeb4b0:8793/log/stg_process/Process_g/2020-12-23T00:00:00+00:00/2.log
*** Failed to fetch log file from worker. HTTPConnectionPool(host='bf23abdeb4b0', port=8793): Max retries exceeded with url: /log/stg_process/Process_g/2020-12-23T00:00:00+00:00/2.log (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7ff6f20ae898>: Failed to establish a new connection: [Errno 111] Connection refused',))
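The error shows the webserver's lookup order: it first looks for the task log on its own filesystem, and only when that fails does it try to fetch the log over HTTP from the host that ran the task, on the worker log server port (8793 by default). A minimal sketch of that fallback logic, under the path layout visible in the log above (`task_log_location` is a hypothetical helper for illustration, not an Airflow API):

```python
import os

WORKER_LOG_PORT = 8793  # Airflow's default worker_log_server_port

def task_log_location(base, dag_id, task_id, execution_date, try_number, hostname):
    """Return ("local", path) if the log exists on disk, else ("remote", url)."""
    rel = f"{dag_id}/{task_id}/{execution_date}/{try_number}.log"
    local_path = os.path.join(base, rel)
    if os.path.exists(local_path):
        return ("local", local_path)
    # Fallback: fetch from the machine that executed the task. In this
    # docker-compose setup the recorded hostname is a container id that
    # other containers cannot reach, hence "Connection refused".
    return ("remote", f"http://{hostname}:{WORKER_LOG_PORT}/log/{rel}")
```

If every Airflow container shares the same logs volume, the first branch succeeds and the HTTP fallback is never needed.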
What am I missing here? Any help would be appreciated...
Have you tried resetting the metadata database and then rebuilding?
airflow resetdb
I may be late, but I think the problem is with this line in your Dockerfile:
COPY airflow.cfg /opt/airflow/
It should instead be:
COPY airflow.cfg /opt/airflow/airflow.cfg