
Java removes a slash from path and later gives me NoSuchFileException

I am trying to write to Hadoop HDFS, using this line of code:

Files.write(Paths.get("hdfs:////localhost:9000/user/cloudera/trial/" + "cat.txt"), "miao miao!".getBytes());

The Spark Application gives me this exception:

java.nio.file.NoSuchFileException: hdfs:/quickstart.cloudera:9000/user/cloudera/trial/cat2

My interpretation is that the error occurs because there is only one slash left after "hdfs:".
I remember having already used the java.nio.Files methods to write to HDFS, so I would rule that out as the problem.
What should I do to prevent this exception?
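(For context: the slash collapsing is not Java "removing" anything from a URI. `Paths.get(String)` does not parse URI schemes at all; it treats the whole string as an ordinary path on the default file system, and on Unix-like systems runs of `/` are normalized to a single slash. A minimal sketch showing this, using the same path string as the question:)

```java
import java.nio.file.Paths;

public class SlashDemo {
    public static void main(String[] args) {
        // Paths.get(String) does not understand the "hdfs:" scheme; the whole
        // string is interpreted as a path on the default (local) file system,
        // and consecutive slashes are collapsed during normalization.
        String p = Paths.get("hdfs:////localhost:9000/user/cloudera/trial/cat.txt")
                        .toString();
        System.out.println(p);
        // On a Unix default file system this prints:
        // hdfs:/localhost:9000/user/cloudera/trial/cat.txt
        // which matches the single-slash path in the NoSuchFileException.
    }
}
```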

EDIT: The import section

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

No, you cannot use java.nio.Files to write to HDFS. The java.nio classes don't know about the NameNode and DataNodes in a Hadoop cluster; they only operate on the local file system. You need to use the Hadoop libraries to communicate with HDFS.

Here I have an example to write to HDFS using Java:

https://github.com/lalosam/HadoopInExamples/blob/master/src/main/java/rojosam/utils/hdfs/CreateTestsFiles.java
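As a rough sketch of the same idea, here is what writing the question's file through the Hadoop `FileSystem` API could look like. This assumes `hadoop-client` is on the classpath and a cluster is reachable at the question's `hdfs://localhost:9000` URI; adjust the URI and path for your setup:

```java
import java.net.URI;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Connect to the NameNode; the URI here matches the question's cluster.
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);
        Path file = new Path("/user/cloudera/trial/cat.txt");
        // create(path, true) overwrites the file if it already exists.
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("miao miao!".getBytes(StandardCharsets.UTF_8));
        }
        fs.close();
    }
}
```

Unlike `java.nio.file.Paths`, the Hadoop `Path` and `FileSystem` classes understand the `hdfs://` scheme and route reads and writes through the NameNode.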

