
How to set up remote debugging for Airflow running with the official docker-compose?

I tried to follow this article: https://medium.com/@andrewhharmon/apache-airflow-using-pycharm-and-docker-for-remote-debugging-b2d1edf83d9d.

The problematic parts are:

  • Which container should I select when creating the Python interpreter?
  • Is it correct to set up the debug configuration to run Airflow's binary directly? When I use the airflow-worker container as the interpreter and run in debug mode, I get:
data-pipeline-airflow-worker-1  | /home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:360: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
data-pipeline-airflow-worker-1  |   FutureWarning,
data-pipeline-airflow-worker-1  | 
data-pipeline-airflow-worker-1  | airflow command error: the following arguments are required: GROUP_OR_COMMAND, see help above.
data-pipeline-airflow-worker-1  | usage: airflow [-h] GROUP_OR_COMMAND ...
data-pipeline-airflow-worker-1  | 
data-pipeline-airflow-worker-1  | positional arguments:
data-pipeline-airflow-worker-1  |   GROUP_OR_COMMAND
data-pipeline-airflow-worker-1  | 
data-pipeline-airflow-worker-1  |     Groups:
data-pipeline-airflow-worker-1  |       celery         Celery components
data-pipeline-airflow-worker-1  |       config         View configuration
data-pipeline-airflow-worker-1  |       connections    Manage connections
data-pipeline-airflow-worker-1  |       dags           Manage DAGs
data-pipeline-airflow-worker-1  |       db             Database operations
data-pipeline-airflow-worker-1  |       jobs           Manage jobs
data-pipeline-airflow-worker-1  |       kubernetes     Tools to help run the KubernetesExecutor
data-pipeline-airflow-worker-1  |       pools          Manage pools
data-pipeline-airflow-worker-1  |       providers      Display providers
data-pipeline-airflow-worker-1  |       roles          Manage roles
data-pipeline-airflow-worker-1  |       tasks          Manage tasks
data-pipeline-airflow-worker-1  |       users          Manage users
data-pipeline-airflow-worker-1  |       variables      Manage variables

OK, I found the answer: you need to set up the Python interpreter in the airflow-worker container. The error above occurred because the run configuration invoked the airflow binary without any subcommand. In the run/debug configuration, the script path should be /home/airflow/.local/bin/airflow and the parameters should be tasks test [dag_id] [task_id] [start_date]. I am using Airflow 2.3.2.
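
For reference, here is a minimal sketch of a DAG you could debug this way; the dag_id debug_example, the task_id print_context, and the 2022-01-01 date are made-up placeholders, and the file is assumed to sit in the dags folder mounted by the official docker-compose file:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def print_context(**context):
    # Set a PyCharm breakpoint here; "airflow tasks test" runs the task
    # in-process, so the remote debugger stops on this line.
    print(f"Running for logical date {context['ds']}")


with DAG(
    dag_id="debug_example",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(task_id="print_context", python_callable=print_context)

With the remote interpreter pointing at the airflow-worker container, the run/debug configuration for this sketch would use /home/airflow/.local/bin/airflow as the script path and tasks test debug_example print_context 2022-01-01 as the parameters; running it in debug mode should then stop at breakpoints inside print_context.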
