
How can a Java client upload/download a file to/from a remote HDFS server?

I run a pseudo-distributed Hadoop 2.7.1 HDFS in my Fedora virtual machine (in VMware Workstation). I can upload/download a local file in Fedora with the hadoop hdfs shell commands.

But how can I write a simple Java class to upload/download a file from my Windows host?

I found some example code like:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

FileSystem fs = FileSystem.get(new Configuration());
Path local_file = new Path("testfile.txt");
Path remote_path = new Path("/");
fs.copyFromLocalFile(local_file, remote_path);

I found hadoop-core-1.2.jar in the Maven repository, but that version is too old for 2.7.1. I don't know which jar package to use to import the HDFS Java classes.

Try the example code from using-filesystem-api-to-read-and-write-data-to-hdfs with the Maven configuration below:

<properties>
    <hadoop.version>2.7.0</hadoop.version>
    <hadoop.core>1.2.1</hadoop.core>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>${hadoop.core}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>
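
For a complete picture, here is a minimal sketch of a client class that uploads and downloads a file with the FileSystem API. The NameNode URI hdfs://192.168.1.100:9000 and the Windows paths are placeholder assumptions, not values from the question; substitute your Fedora VM's address and your own paths.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsClient {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point the client at the remote NameNode (placeholder address;
        // use your Fedora VM's IP and the port from fs.defaultFS).
        FileSystem fs = FileSystem.get(new URI("hdfs://192.168.1.100:9000"), conf);

        // Upload: copy a local Windows file into the HDFS root directory.
        fs.copyFromLocalFile(new Path("C:/temp/testfile.txt"), new Path("/testfile.txt"));

        // Download: copy the HDFS file back to a local Windows path.
        fs.copyToLocalFile(new Path("/testfile.txt"), new Path("C:/temp/downloaded.txt"));

        fs.close();
    }
}

Note that a Hadoop client running on Windows may also need winutils.exe with HADOOP_HOME set; without it, local-filesystem operations such as copyToLocalFile can fail.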
