
Send a file from Hadoop to a remote server via sftp

I have some files in a directory on HDFS and I need to move them via SFTP to a remote server. Normally, what I do is first download the files from HDFS to a Unix folder with

hdfs dfs -get /hdfs_path/folder/file.txt /unix_path/destination/path

and then I move them with sftp as:

sftp -v user@remoteServer <<EOF
lcd /unix_path/destination/path
cd /remote_folder/path/
put file.txt
quit
EOF

What I want to know: is there any way to send the file via SFTP directly from Hadoop to the remote server, without needing the prior hdfs dfs -get?

Nifi was created specifically to handle this type of file movement (it is a separate install). You should check it out. I am not aware of another way of doing it unless you wrote some code in Spark.
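
If the goal is simply to avoid the intermediate copy on local disk, one common workaround (not part of the original answer, and assuming plain ssh access to the remote host is allowed, not only sftp) is to stream the file out of HDFS and pipe it over SSH, reusing the paths from the question:

# stream the HDFS file straight to the remote host; nothing is written to the local filesystem
hdfs dfs -cat /hdfs_path/folder/file.txt | ssh user@remoteServer "cat > /remote_folder/path/file.txt"

The data still passes through the machine running the command, but only as a stream, so no local staging directory is needed. If the remote side really only permits SFTP (no shell commands), this pipe will not work and a tool such as Nifi remains the cleaner option.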

