
How to setup remote debug for Airflow running with official docker-compose?

I am trying to follow this article: https://medium.com/@andrewhharmon/apache-airflow-using-pycharm-and-docker-for-remote-debugging-b2d1edf83d9d.

The parts I am stuck on are:

  • Which container should I select when creating the Python Interpreter?
  • Is it correct to set up the debug configuration to run the Airflow binary directly? When I use the airflow worker container as the interpreter and run it in debug mode, I get:
data-pipeline-airflow-worker-1  | /home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py:360: FutureWarning: The auth_backends setting in [api] has had airflow.api.auth.backend.session added in the running config, which is needed by the UI. Please update your config before Apache Airflow 3.0.
data-pipeline-airflow-worker-1  |   FutureWarning,
data-pipeline-airflow-worker-1  | 
data-pipeline-airflow-worker-1  | airflow command error: the following arguments are required: GROUP_OR_COMMAND, see help above.
data-pipeline-airflow-worker-1  | usage: airflow [-h] GROUP_OR_COMMAND ...
data-pipeline-airflow-worker-1  | 
data-pipeline-airflow-worker-1  | positional arguments:
data-pipeline-airflow-worker-1  |   GROUP_OR_COMMAND
data-pipeline-airflow-worker-1  | 
data-pipeline-airflow-worker-1  |     Groups:
data-pipeline-airflow-worker-1  |       celery         Celery components
data-pipeline-airflow-worker-1  |       config         View configuration
data-pipeline-airflow-worker-1  |       connections    Manage connections
data-pipeline-airflow-worker-1  |       dags           Manage DAGs
data-pipeline-airflow-worker-1  |       db             Database operations
data-pipeline-airflow-worker-1  |       jobs           Manage jobs
data-pipeline-airflow-worker-1  |       kubernetes     Tools to help run the KubernetesExecutor
data-pipeline-airflow-worker-1  |       pools          Manage pools
data-pipeline-airflow-worker-1  |       providers      Display providers
data-pipeline-airflow-worker-1  |       roles          Manage roles
data-pipeline-airflow-worker-1  |       tasks          Manage tasks
data-pipeline-airflow-worker-1  |       users          Manage users
data-pipeline-airflow-worker-1  |       variables      Manage variables

OK, I found the answer. You need to create the Python interpreter from the airflow worker container. The script path should be /home/airflow/.local/bin/airflow, and the parameters should be tasks test [dag_id] [task_id] [start_date]. (The GROUP_OR_COMMAND error above came from launching the airflow binary without any subcommand.) I am using Airflow 2.3.2.
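
For a concrete illustration, here is a minimal sketch of a DAG that such a run configuration could target, assuming Airflow 2.3.x. The file name, the dag_id debug_example, and the task_id print_context are hypothetical names invented for this example; any DAG and task in your mounted dags/ folder would work the same way.

# dags/debug_example.py -- a minimal DAG to exercise "airflow tasks test".
# The dag_id "debug_example" and task_id "print_context" are hypothetical
# names for this sketch; substitute your own DAG and task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def print_context(**context):
    # Set a breakpoint here: "airflow tasks test" runs the task in the
    # current process, so a debugger attached to the interpreter stops on it.
    print(f"Running for logical date {context['ds']}")


with DAG(
    dag_id="debug_example",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,  # run only via "tasks test" or manual trigger
    catchup=False,
) as dag:
    PythonOperator(task_id="print_context", python_callable=print_context)

With that file in the dags/ folder mounted by the compose setup, the run configuration described above is equivalent to executing airflow tasks test debug_example print_context 2022-01-01 inside the airflow worker container.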
