
How to run bash script in python using Airflow

I have a Python function

import subprocess

def running_dump():
    with open('dags/scripts/shell_scripts/daily_pg_dump.sh', 'rb') as file:
        script = file.read()
        print(script)
    subprocess.call(script, shell=True)

and a shell file daily_pg_dump.sh

PGPASSWORD='*******' pg_dump -h ***** -p ***** -U ***** -d * -t table_1 > dags/data_bucket/table_1_backup.sql

Airflow DAG

pg_dump_to_storage = PythonOperator(
    task_id='task_1',
    python_callable=running_dump,
    dag=dag
)

When I call the Python function using Airflow, the shell script seems not to run, because table_1_backup.sql is not created. Instead I'm getting Returned value was: 0, and no error appears. What am I missing?

If you want to run bash scripts from Airflow, you can use BashOperator instead of PythonOperator.

Following this example in the documentation, in your case it would be:

from airflow.operators.bash import BashOperator

running_dump = "path/to/daily_pg_dump.sh "  # note the space after the script's name

pg_dump_to_storage = BashOperator(
    task_id='task_1',
    bash_command=running_dump,
    dag=dag,
)

NOTE: You only need the trailing space after the script's path when you are NOT using Jinja templating in your bash script; the space stops Airflow from trying to render the .sh file as a template.
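For contrast, here is a minimal sketch of the templated form that the note refers to. Without the trailing space, Airflow treats a bash_command ending in .sh as a Jinja template file, resolves it relative to the DAG file's folder, and renders it before execution (the scripts/ path below is hypothetical):

from airflow.operators.bash import BashOperator

# No trailing space: Airflow loads scripts/daily_pg_dump.sh, renders any
# Jinja in it (e.g. {{ ds }}), and then runs the rendered script.
pg_dump_templated = BashOperator(
    task_id='task_1_templated',
    bash_command='scripts/daily_pg_dump.sh',  # hypothetical path relative to the DAG file
    dag=dag,
)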

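If you would rather keep the PythonOperator, a minimal sketch of one way to make failures visible is to hand the script's path to bash instead of reading the file's contents, and let check=True raise on a non-zero exit (the relative path is the one from the question and assumes the worker's working directory; an absolute path is safer):

import subprocess

def running_dump():
    # Run the script through bash rather than passing its raw bytes to the shell;
    # check=True raises CalledProcessError if pg_dump exits non-zero, so the
    # task fails loudly instead of silently returning 0.
    subprocess.run(
        ['bash', 'dags/scripts/shell_scripts/daily_pg_dump.sh'],
        check=True,
    )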
