How to pass output data between Airflow SSHOperator tasks using XCom?
Problem summary:
Unfortunately, I haven't found anything helpful in the Airflow documentation.
Code example:
import datetime

import airflow
from airflow.operators.dummy_operator import DummyOperator
from airflow.contrib.operators.ssh_operator import SSHOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime.datetime(2020, 1, 1, 0, 0),
}

dag = airflow.DAG(
    'example',
    default_args=default_args,
)
task_dummy = DummyOperator(
    task_id='task_dummy',
    dag=dag
)
cmd_ssh = """
for f in "file1" "file2"
do
    if hdfs dfs -test -d /data/$f
    then hdfs dfs -rm -r -skipTrash /data/$f
    else echo "doesn't exist"
    fi
done
"""
task_1 = SSHOperator(
    ssh_conn_id='server_connection',
    task_id='task_ssh',
    command=cmd_ssh,
    do_xcom_push=True,
    dag=dag
)
My question is: how do I access the stdout of task_1 when I set do_xcom_push=True?
You can access the XCom data in templated fields, or in callables that receive the Airflow context, such as the PythonOperator (and its child classes). From the documentation:
# inside a PythonOperator called 'pushing_task'
def push_function():
    return value

# inside another PythonOperator where provide_context=True
def pull_function(**context):
    value = context['task_instance'].xcom_pull(task_ids='pushing_task')