
How to copy files from a remote server to an HDFS location

I want to copy files from a remote server directly to an HDFS location using SFTP, without first copying them to the local filesystem. The HDFS destination is a secured cluster. Please advise whether this is feasible and, if so, how to proceed. I would also like to know whether there is any way to connect and copy other than SFTP.

I think the most convenient approach (given that your remote machine can connect to the Hadoop cluster) is to make that remote machine act as an HDFS client. Just SSH to that machine, install the Hadoop distribution, configure it properly, then run:

hadoop fs -put /local/path /hdfs/path
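If installing a Hadoop client on the remote server is not an option, another sketch (hostnames and paths below are placeholders, and this assumes a cluster node can reach the remote server over SSH) is to stream the file through a pipe into `hdfs dfs -put -`, which reads from standard input, so the file never lands on a local disk:

```shell
#!/bin/sh
# Run on a machine that already has a configured HDFS client.
# For a Kerberos-secured cluster, obtain a ticket first:
kinit myuser@EXAMPLE.COM   # principal name is a placeholder

# Stream the remote file over SSH straight into HDFS;
# "-put -" tells the HDFS shell to read from stdin.
ssh user@remote.example.com "cat /remote/data/file.csv" \
  | hdfs dfs -put - /hdfs/path/file.csv
```

This avoids an intermediate local copy at the cost of a single-stream transfer; for bulk or parallel copies between clusters, Hadoop's `distcp` tool is the usual alternative, though it requires both endpoints to be Hadoop-accessible filesystems rather than plain SFTP servers.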
