Send a file from Hadoop to a remote server via SFTP
I have some files in a directory on HDFS and I need to move them via SFTP to a remote server. Normally, I first download the files from HDFS to a local Unix folder with
hdfs dfs -get /hdfs_path/folder/file.txt /unix_path/destination/path
and then I move them to the remote server with sftp:
sftp -v user@remoteServer <<EOF
lcd /unix_path/destination/path
cd /remote_folder/path/
put file.txt
quit
EOF
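For reference, the two steps above can be wrapped into one small script. The paths and hostname are the examples from the question; taking the file name as a `$1` parameter is my addition, and the script assumes the `hdfs` and `sftp` clients are on the PATH with SSH key authentication already configured.

```shell
#!/bin/sh
# Two-step workflow from the question as one reusable script:
# 1) copy the file out of HDFS to a local staging folder,
# 2) push it to the remote server with a batch sftp session.
FILE="$1"   # file name to transfer, e.g. file.txt (my addition)

hdfs dfs -get "/hdfs_path/folder/$FILE" /unix_path/destination/path

sftp -v user@remoteServer <<EOF
lcd /unix_path/destination/path
cd /remote_folder/path/
put $FILE
quit
EOF
```

Usage would look like `./push_to_remote.sh file.txt` (the script name is hypothetical). Note that this still requires the intermediate local copy, which is exactly the step the question is asking how to avoid.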
What I want to know: is there any way to send a file directly via SFTP from Hadoop to the remote server, without the intermediate hdfs dfs -get?