
Write to external Hadoop with Spark

I'm using Java-Spark.

I'm trying to write to an external HDFS directory as follows:

df.write().mode(mode).save("hdfs://myservername:8020/user/path/to/hdfs");

And got this exception:

host details: local host is: ... destination host is: ...

How can I write to "external" hdfs directory from Spark and not to local Hadoop/HDFS?

Thanks

That exception typically means the Spark driver or executors cannot reach the NameNode RPC endpoint under the hostname you supplied. Check that the HDFS NameNode hostname is resolvable from the Spark cluster; alternatively, you can use the NameNode's IP address:

hdfs://<HDFS_NAMENODE_IP>:8020/user/path/to/hdfs
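
For completeness, here is a minimal Java-Spark sketch of the write with a fully qualified URI; the input path, save mode, and NameNode address are placeholders, not values from the original question:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class ExternalHdfsWrite {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("write-to-external-hdfs")
                .getOrCreate();

        // Placeholder input; the question's df can come from anywhere.
        Dataset<Row> df = spark.read().json("/some/input/path");

        // Fully qualified URI: scheme + NameNode host (or IP) + RPC port + path.
        // 8020 is the usual NameNode RPC port; it must match fs.defaultFS in the
        // remote cluster's core-site.xml.
        df.write()
          .mode(SaveMode.Overwrite) // placeholder; the question passes a `mode` variable
          .save("hdfs://<HDFS_NAMENODE_IP>:8020/user/path/to/hdfs");
    }
}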

You can also set the default filesystem in the Spark application's configuration:

spark.conf.set("fs.defaultFS", "hdfs://<HDFS_NAMENODE_IP>:8020/")
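
In Java, a sketch of two ways to apply this setting (the address is a placeholder): Hadoop properties prefixed with spark.hadoop. are forwarded to Hadoop when the session is built, and the underlying Hadoop Configuration can also be mutated directly:

import org.apache.spark.sql.SparkSession;

public class DefaultFsConfig {
    public static void main(String[] args) {
        // Option 1: pass the Hadoop property at build time via the spark.hadoop. prefix.
        SparkSession spark = SparkSession.builder()
                .appName("default-fs-example")
                .config("spark.hadoop.fs.defaultFS", "hdfs://<HDFS_NAMENODE_IP>:8020/")
                .getOrCreate();

        // Option 2: set it directly on the underlying Hadoop configuration.
        spark.sparkContext().hadoopConfiguration()
             .set("fs.defaultFS", "hdfs://<HDFS_NAMENODE_IP>:8020/");

        // With fs.defaultFS pointing at the external cluster, scheme-less
        // paths now resolve against it:
        //   df.write().mode(mode).save("/user/path/to/hdfs");
    }
}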
