
Send a file from Hadoop to a remote server via SFTP

I have some files in a directory on HDFS that I need to move to a remote server via SFTP. Normally I first download the files from HDFS to a local Unix folder with

hdfs dfs -get /hdfs_path/folder/file.txt /unix_path/destination/path

and then transfer them with sftp:

sftp -v user@remoteServer <<EOF
lcd /unix_path/destination/path
cd /remote_folder/path/
put file.txt
quit
EOF

What I want to know: is there any way to send the file via SFTP directly from Hadoop to the remote server, without the intermediate hdfs dfs -get step?

NiFi was created specifically to handle this type of file movement (it is a separate install); you should check it out. I am not aware of another way of doing it unless you write some code in Spark.
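
If installing NiFi is too heavy for a one-off transfer, a common shell-level workaround is to stream the file out of HDFS and pipe it across the connection, so it never lands on the local Unix filesystem. This is a sketch, and it assumes the remote account allows running commands over plain ssh rather than only the SFTP subsystem:

# Stream the file from HDFS directly to the remote host over SSH;
# no intermediate copy is written to the local filesystem.
hdfs dfs -cat /hdfs_path/folder/file.txt \
  | ssh user@remoteServer 'cat > /remote_folder/path/file.txt'

If only the SFTP subsystem is enabled on the remote side, a curl built with SFTP support can upload from standard input instead (again a sketch; check that sftp appears in the protocol list printed by curl --version):

# -T - makes curl read the upload body from stdin.
hdfs dfs -cat /hdfs_path/folder/file.txt \
  | curl -T - sftp://user@remoteServer/remote_folder/path/file.txt

Both variants stream the data across the network once, at the cost of losing sftp's interactive session.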
