
Getting an error when importing Spark dependencies in IntelliJ IDEA

I am using IntelliJ IDEA with Maven integration, but I am getting an error on the following lines:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

I am trying to run the following example:

package com.spark.hello;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class Hello {

    public static void main(String[] args) {
        String logFile = "F:\\Spark\\a.java";
        // No setMaster(...) here, so a master URL must be supplied
        // externally (e.g. via spark-submit --master).
        SparkConf conf = new SparkConf().setAppName("Simple Application");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> logData = sc.textFile(logFile).cache();

        // Count lines containing "a".
        long numAs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("a"); }
        }).count();

        // Count lines containing "b".
        long numBs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("b"); }
        }).count();

        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);

        sc.stop();
    }
}

Please help me solve this issue. Or is there any other way to run this kind of project?

Without seeing the error, I'm guessing the IDE is telling you these are unused imports. Be sure to double-check the dependencies and their versions; a sample pom.xml entry is sketched below.
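For reference, a minimal sketch of what the spark-core dependency might look like in your pom.xml. The version and Scala suffix shown are assumptions: a 1.6.x build with Scala 2.10 matches the pre-lambda Function API used in the question, but you should pick whatever matches your cluster.

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.3</version>
    </dependency>

After editing the pom, re-import the Maven project so IntelliJ re-resolves the dependencies; unresolved imports usually clear up once the jars are actually on the classpath.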

Alt + Enter is the shortcut I've used to resolve many of the issues.
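As for running this kind of project another way: one common route is to package the job with Maven and launch it with spark-submit instead of running it from the IDE. A rough sketch (the jar name below is hypothetical; use whatever artifact your build actually produces):

    mvn package
    spark-submit --class com.spark.hello.Hello --master "local[2]" target/spark-hello-1.0.jar

The --master flag here supplies the master URL that the code above deliberately leaves out of its SparkConf.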
