
How to pass output data between airflow SSHOperator tasks using xcom?

Problem summary:

  • I need to get the stdout from one SSHOperator using xcom
  • Filter some rows and get the output values to pass them to another SSHOperator

Unfortunately, I haven't found anything helpful in the Airflow documentation.

Code example:

import datetime

import airflow
from airflow.operators.dummy_operator import DummyOperator
from airflow.contrib.operators.ssh_operator import SSHOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime.datetime(2020, 1, 1, 0, 0),
}

dag = airflow.DAG(
    'example',
    default_args=default_args,
)

task_dummy = DummyOperator(
    task_id='task_dummy',
    dag=dag
)

cmd_ssh = """
for f in "file1" "file2"
do
    if $(hdfs dfs -test -d /data/$f)
        then hdfs dfs -rm -r -skipTrash /data/$f
        else echo "doesn't exists"
    fi
done
"""

task_1 = SSHOperator(
    ssh_conn_id='server_connection',
    task_id='task_ssh',
    command=cmd_ssh,
    do_xcom_push=True,
    dag=dag
)

My question is: how do I access the stdout from task_1 when I set do_xcom_push=True?

You can access the XCom data in templated fields or in callables that receive the Airflow context, such as the PythonOperator (and its child classes). From the documentation:

# inside a PythonOperator called 'pushing_task'
def push_function():
    return value

# inside another PythonOperator where provide_context=True
def pull_function(**context):
    value = context['task_instance'].xcom_pull(task_ids='pushing_task')
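Applied to this DAG, one common pattern is to put a PythonOperator between the two SSHOperators: it pulls task_1's stdout from XCom, filters it, and returns the filtered value, which a downstream SSHOperator can then pull in its Jinja-templated `command`. Note that, depending on the Airflow version and the `enable_xcom_pickling` setting, the SSHOperator may push its aggregated stdout base64-encoded. A minimal sketch (the `task_filter`/`task_ssh_2` ids and the `"file1"` keyword are hypothetical; the decode-or-fallback logic is an assumption about the XCom format):

```python
import base64


def filter_stdout(raw, keyword):
    """Decode SSHOperator XCom output (which may be base64-encoded when
    XCom pickling is disabled) and keep only lines containing `keyword`."""
    try:
        # Assumption: the pushed value is base64-encoded stdout.
        text = base64.b64decode(raw).decode('utf-8')
    except Exception:
        # Fall back to treating the value as plain text.
        text = raw
    return '\n'.join(line for line in text.splitlines() if keyword in line)


# Hypothetical wiring inside the DAG above (sketch, not tested against
# a live Airflow installation):
#
# def pull_and_filter(**context):
#     raw = context['task_instance'].xcom_pull(task_ids='task_ssh')
#     return filter_stdout(raw, 'file1')   # return value is pushed to XCom
#
# task_filter = PythonOperator(
#     task_id='task_filter',
#     python_callable=pull_and_filter,
#     provide_context=True,
#     dag=dag,
# )
#
# task_2 = SSHOperator(
#     ssh_conn_id='server_connection',
#     task_id='task_ssh_2',
#     command="echo '{{ ti.xcom_pull(task_ids=\"task_filter\") }}'",
#     dag=dag,
# )
#
# task_ssh >> task_filter >> task_2
```

The intermediate PythonOperator keeps the shell command on the second server simple: all filtering happens in Python, and the templated `xcom_pull` only has to interpolate the already-cleaned value.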

