
How does a Java client upload/download a file to/from a remote HDFS server?

I run Hadoop-2.7.1 HDFS in pseudo-distributed mode in a Fedora virtual machine (in VMware Workstation). I can upload/download a local file in Fedora with the hadoop hdfs shell commands.

But how can I write a simple Java class to upload/download a file from my Windows host?

I found some example code like this:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

FileSystem fs = FileSystem.get(new Configuration()); // the class is Configuration, not "Configure"
Path localFile = new Path("testfile.txt");
Path remotePath = new Path("/");
fs.copyFromLocalFile(localFile, remotePath); // the API method is copyFromLocalFile, not copyFromLocal

I did find hadoop-core-1.2.jar in the Maven repository, but that version is far too old for 2.7.1. I don't know which Maven artifact to use to get the HDFS Java classes.

Try the example code from using-filesystem-api-to-read-and-write-data-to-hdfs with the Maven configuration below:

<properties>
    <hadoop.version>2.7.0</hadoop.version>
    <hadoop.core>1.2.1</hadoop.core>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>${hadoop.core}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>
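With those dependencies on the classpath, a minimal upload/download client might look like the sketch below. The NameNode URI `hdfs://192.168.1.100:9000`, the `hdfs` user name, and the local Windows paths are placeholder assumptions — substitute the `fs.defaultFS` value from your Fedora VM's core-site.xml and your own paths:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsFileTransfer {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Connect to the remote NameNode as a given HDFS user.
        // The URI and user below are placeholders: use the fs.defaultFS
        // value from your cluster's core-site.xml and a user that has
        // write permission on the target directory.
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://192.168.1.100:9000"), conf, "hdfs");

        // Upload: local Windows file -> HDFS root directory
        fs.copyFromLocalFile(new Path("C:/tmp/testfile.txt"), new Path("/"));

        // Download: HDFS file -> local Windows directory
        fs.copyToLocalFile(new Path("/testfile.txt"), new Path("C:/tmp/downloaded"));

        fs.close();
    }
}
```

Note that on Windows the 2.7.x client also expects `winutils.exe` to be reachable via `HADOOP_HOME`, otherwise local filesystem operations can fail with a "could not locate executable" error.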
