File transfer from one server to another server in Airflow
I have a file (file1.txt) on server1 under the user "username1", at the path /home/A/file1.txt, and I want to transfer it to another server, "server2", as the user "username2", placing it in /home/B/. I have written the code below, but it is not working as expected. Where did I go wrong?
import airflow
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta
from airflow.contrib.operators.ssh_operator import SSHOperator
from airflow.contrib.operators.sftp_operator import SFTPOperator
from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.sensors.sftp_sensor import SFTPSensor
default_args = {
    'owner': 'john',
    'depends_on_past': False,
    'email': [''],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5)
}
dag = DAG(
    'file_transfer',
    default_args=default_args,
    description='A file transfer',
    schedule_interval=None)
bash_file_transfer = """
cp server1@username1:/home/A/file1.txt server2@username2:/home/B/
"""
t1 = SSHOperator(
    ssh_conn_id='server1_conn',
    task_id='connected_to_server1',
    dag=dag
)
t2 = SFTPOperator(
    sftp_conn_id='server2_conn',
    task_id='transfer_file_from_server1_to_server2',
    command=bash_file_transfer,
    dag=dag
)
t1 >> t2
I think your mistake is in the bash_file_transfer declaration. It should be scp, not cp.
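As a sketch of that fix (using the connection ID and paths from the question; note also that scp expects user@host:path, whereas the question writes host@user:path), the command could be built and run through a single SSHOperator like this:

```python
# Sketch of the corrected transfer command. scp takes user@host:path,
# not host@user:path as written in the question.
src_path = "/home/A/file1.txt"            # file on server1
dest = "username2@server2:/home/B/"       # target user, host, and directory
scp_command = f"scp {src_path} {dest}"

# Hypothetical wiring into the DAG: run scp on server1 via its SSH
# connection, so no second operator is needed for the copy itself.
#   t1 = SSHOperator(
#       ssh_conn_id='server1_conn',
#       task_id='transfer_file',
#       command=scp_command,
#       dag=dag,
#   )
print(scp_command)
```

This assumes server1 can reach server2 over SSH (for example with a key already set up for username2), since scp will otherwise prompt for a password inside the task.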