
Can't connect Spark with Hortonworks Sandbox from Eclipse

I can't get the Spark code I wrote in Eclipse to connect.

The code is below; please advise how to get this working. Any help is appreciated.

    import java.util.Arrays;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    import scala.Tuple2;

    public class SparkTest {

        public static void main(String[] args) {

            SparkConf conf = new SparkConf()
                    .setAppName("JD Word Counter").setMaster("local");

            JavaSparkContext sc = new JavaSparkContext(conf);

            // hdfs://localhost:8020/user/root/textfile/test.txt
            JavaRDD<String> inputFile = sc.textFile("hdfs://localhost:8020/user/root/textfile/test.txt");

            System.out.println("Hello start");
            System.out.println(inputFile.collect());

            JavaRDD<String> wordsFromFile = inputFile.flatMap(content ->
                    Arrays.asList(content.split(" ")).iterator());

            System.out.println("hello end");

            JavaPairRDD<String, Integer> countData = wordsFromFile
                    .mapToPair(t -> new Tuple2<>(t, 1))
                    .reduceByKey((x, y) -> x + y);

            countData.saveAsTextFile("hdfs://localhost:8020/user/root/fileTest/");

            System.out.println(" This java program is complete");
        }
    }

Error:

> I/O error constructing remote block reader.
> org.apache.hadoop.net.ConnectTimeoutException: 60000 millis timeout
> while waiting for channel to be ready for connect. ch :
> java.nio.channels.SocketChannel[connection-pending
> remote=/172.18.0.2:50010] at org.apache.hadoop.net.NetUtils.c

Change localhost to the IP address of the HDP sandbox, or put the hdfs-site.xml file on your classpath, and make sure all the ports are open and reachable from the external machine.
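The first suggestion above amounts to swapping the host in the HDFS URI passed to `sc.textFile`. A minimal sketch, where `192.168.56.101` is a hypothetical example address (substitute your sandbox's actual IP):

```java
// Builds the HDFS URI with the sandbox's IP in place of localhost.
// 192.168.56.101 is a hypothetical address; replace it with your HDP sandbox's IP.
public class SandboxUri {

    static String hdfsUri(String host, int port, String path) {
        return "hdfs://" + host + ":" + port + path;
    }

    public static void main(String[] args) {
        String input = hdfsUri("192.168.56.101", 8020, "/user/root/textfile/test.txt");
        // Pass this to sc.textFile(input) instead of the localhost URI.
        System.out.println(input);
    }
}
```

The alternative, putting hdfs-site.xml on the classpath, lets the Hadoop client pick up the cluster's own NameNode and DataNode addresses instead of whatever you hard-code in the URI.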


