
Error in "Eclipse Plugin for Scala" while compiling a Spark class

I am using CDH5.1.0 to do some simple Spark programming. I also have Eclipse Juno (which comes with the VM) with the Scala IDE plugin 2.10.0 installed. I am getting the following error in the IDE:

Bad symbolic reference. A signature in SparkContext.class refers to term io in package org.apache.hadoop which is not available. It may be completely missing from the current classpath, or the version on the classpath might be incompatible with the version used when compiling SparkContext.class. SimpleApp.scala /MyScalaProject/src/com/test/spark1 line 10 Scala Problem

Code:

package com.test.spark1
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/Desktop/scala/sparktest.txt" // Should be some file on your system
    val conf = new org.apache.spark.SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s").format(numAs, numBs)
  }
}

I get the same error at line #10 (val conf = new org.apache.spark.SparkCon...) and also at line #15 (println...).

My project build path has /usr/lib/spark/assembly/lib/spark-assembly-1.0.0-cdh5.1.0-hadoop2.3.0-cdh5.1.0.jar, and I checked that all the classes needed for this simple Scala program are there.

The compilation error went away once I added the following jar to the build path:

hadoop-common-2.3.0-cdh5.1.0.jar

So the error was caused by a missing internal dependency: the hadoop-common jar supplies the org.apache.hadoop.io classes that SparkContext.class refers to.
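
For reference, if the project were built with sbt instead of a manually managed Eclipse build path, the equivalent fix would be to declare the Hadoop dependency explicitly. This is only a sketch; the artifact coordinates and the Cloudera repository URL are assumptions based on the CDH 5.1.0 versions mentioned above:

// build.sbt (hypothetical sketch for CDH 5.1.0 artifacts)
name := "MyScalaProject"

scalaVersion := "2.10.4"

// Cloudera's public Maven repository (assumed URL)
resolvers += "cloudera" at "https://repository.cloudera.com/artifactory/cloudera-repos/"

// Spark core plus the hadoop-common artifact that resolves the
// "term io in package org.apache.hadoop" symbolic reference
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.0.0-cdh5.1.0",
  "org.apache.hadoop" % "hadoop-common" % "2.3.0-cdh5.1.0"
)

With a dependency manager like sbt or Maven, transitive dependencies such as hadoop-common are pulled in automatically, so this kind of missing-classpath error is less likely than when adding jars to the Eclipse build path by hand.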
