transfer file from local machine of 1 cluster to hdfs of another cluster
I have 2 hadoop clusters (A and B) and want to transfer a file from the local filesystem of cluster A to the HDFS of cluster B. Is there a way to do it?
I tried copyFromLocal and put, but it looks like they don't copy the file over to the HDFS of cluster B; they fail with: copyFromLocal: Not supported
FYI: the connection appears to be open, since I am able to read the HDFS of cluster B from a local shell on cluster A (hadoop fs -ls hdfs://NNofB:port/path)
I don't know whether there is a direct way from HDFS -> HDFS, but you can get the data from HDFS on some node in cluster A, scp the data to a node in cluster B, and then put the data into HDFS from that node in cluster B.
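The get -> scp -> put relay described above can be sketched as a short shell session. All hostnames, usernames, and paths below are placeholders I've made up for illustration, not values from the original post, and the commands assume a working Hadoop client on both nodes:

```shell
# 1. On a node in cluster A: pull the file out of A's HDFS to local disk.
#    (Skip this step if the file already lives on A's local filesystem,
#    as in the original question.)
hadoop fs -get /data/input.txt /tmp/input.txt

# 2. Copy the local file over the network to a node in cluster B.
scp /tmp/input.txt user@nodeB:/tmp/input.txt

# 3. On that node in cluster B: push the file into B's HDFS.
ssh user@nodeB 'hadoop fs -put /tmp/input.txt /data/input.txt'

# 4. Optional cleanup of the intermediate copies.
rm /tmp/input.txt
ssh user@nodeB 'rm /tmp/input.txt'
```

Note that this stages the whole file on local disk twice, so the nodes involved need enough free space for the intermediate copy.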