
File transfer from one server to another server in Airflow

I have a file (file1.txt) on server1, owned by user "username1", at the path /home/A/file1.txt, and I want to transfer it to another server, "server2", as user "username2", placing it in /home/B/. I have written the code below, but it's not working as expected. Where did I go wrong?

import airflow
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta
from airflow.contrib.operators.ssh_operator import SSHOperator
from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.sensors.sftp_sensor import SFTPSensor

default_args = {
         'owner': 'john',
         'depends_on_past': False,
         'email': [''],
         'email_on_failure': False,
         'email_on_retry': False,
         'retries': 0,
         'retry_delay': timedelta(minutes=5)
       }

dag = DAG(
    'file transfer',
    default_args=default_args,
    description='A file transfer',
    schedule_interval=None)

bash_file_transfer = """
cp server1@username1:/home/A/file1.txt server2@username2:/home/B/
"""

t1 = SSHOperator(
    ssh_conn_id='server1_conn',
    task_id='connected_to_server1',
    dag=dag
)

t2 = SFTPOperator(
    sftp_conn_id='server2_conn',
    task_id='transfer file from server1 to server2',
    command=bash_file_transfer,
    dag=dag
)

t1 >> t2

I think your mistake is in the bash_file_transfer declaration. It should be scp, not cp.
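
For illustration, a minimal sketch of what the corrected DAG could look like, assuming the copy is executed on server1 through the existing server1_conn SSH connection and that server1 can reach server2 with key-based (password-less) SSH as username2. The hostnames, connection id and paths are taken from the question and may need adjusting; note that scp addresses the remote end as user@host:path, and that dag_id/task_id should not contain spaces.

import airflow
from airflow import DAG
from airflow.contrib.operators.ssh_operator import SSHOperator
from datetime import timedelta

default_args = {
    'owner': 'john',
    'depends_on_past': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5)
}

dag = DAG(
    'file_transfer',                   # dag_id without spaces
    default_args=default_args,
    description='A file transfer',
    schedule_interval=None)

# Run scp on server1 and push the file to server2.
# The remote side of scp is written as user@host:path.
bash_file_transfer = """
scp /home/A/file1.txt username2@server2:/home/B/
"""

transfer_file = SSHOperator(
    task_id='transfer_file_from_server1_to_server2',  # task_id without spaces
    ssh_conn_id='server1_conn',
    command=bash_file_transfer,
    dag=dag)

With this approach a single SSHOperator task is enough: the SSH connection to server1 runs the scp command, so no separate SFTPOperator task is needed (this assumes the SSH key of username1 on server1 is authorized for username2 on server2).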
