
How to access Ubuntu VM HDFS in Windows?

I am trying to run a Spark application from Eclipse on Windows, and I need to use a Hadoop cluster. My Hadoop HDFS is running on an Ubuntu virtual machine.

Whenever I pass the HDFS path to the application in Eclipse on Windows (in my case hdfs://localhost:9000), I get a java.net.ConnectException with a message saying that the connection to localhost:9000 failed.
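One thing worth noting about that URI: the hostname in hdfs://localhost:9000 is resolved by the JVM that runs the Spark driver, i.e. the Windows machine itself, so "localhost" points at Windows, not at the Ubuntu VM. A minimal sketch of the two URIs (192.168.56.101 is a hypothetical VM address, not one from the question):

```java
import java.net.URI;

public class HdfsUriCheck {
    public static void main(String[] args) {
        // "localhost" here is resolved on the Windows host running the
        // Spark driver, so the connection never reaches the Ubuntu VM.
        URI local = URI.create("hdfs://localhost:9000");

        // Using the VM's own address (hypothetical example) makes the
        // driver connect to the VM's NameNode instead.
        URI vm = URI.create("hdfs://192.168.56.101:9000");

        System.out.println(local.getHost() + " vs " + vm.getHost());
        // prints: localhost vs 192.168.56.101
    }
}
```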

I have SSH installed on Ubuntu. I tried passing the IP of the Ubuntu machine instead of 'localhost', but that didn't work either. I also played around with core-site.xml, changing the name of the default filesystem (I tried both the machine's hostname and its IP address), with no luck.
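For reference, this is the kind of core-site.xml change I tried. A common cause of this symptom (an assumption on my part, not something I have confirmed) is that with "localhost" as the default filesystem the NameNode listens only on 127.0.0.1 inside the VM and is unreachable from the Windows host; the IP shown below is a hypothetical host-only adapter address:

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- Hypothetical VM address; with hdfs://localhost:9000 here the
         NameNode binds to the loopback interface only, so connections
         from outside the VM are refused. HDFS must be restarted after
         changing this value. -->
    <value>hdfs://192.168.56.101:9000</value>
  </property>
</configuration>
```

Whether the VM is reachable at all also depends on its network mode (bridged or host-only networking, or NAT with a port forwarded for 9000).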

How can I get my Windows machine to access the HDFS cluster in my Ubuntu virtual machine?

I don't know why you're unable to run your job, but if you just want to connect to your Ubuntu machine from Windows, you can use PuTTY, which is available for free: enter the IP address of your machine into PuTTY and you're good to go.

As for the Spark job, make sure your cluster is up and working. To check, run the `jps` command and see whether all the services are running — mainly the NameNode, DataNode, YARN, MapReduce, Spark, and Hive processes.

If all these services are up, then make sure your command is correct. If you still face any issue, post it in the comments and I will help you resolve it.

