Eclipse Scala IDE code not compiling
I downloaded the Eclipse Scala IDE from the scala-ide.org site and am trying to compile my first Scala word count program, but it gives the error "object not a member of package org" on the following import statements:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
After some research I found that I need to add the jar file spark-assembly-1.0.0-hadoop2.2.0.jar to overcome this issue, but after a lot of searching I could not locate this jar. Can anyone help here?
Scala is not a simple language/environment to learn. It is important that you learn how Scala works first and then move on to Spark. There is plenty of material available on the web. A good learning path is: SBT > Scala > using Scala for Spark.
The dependency you mentioned can be put in sbt's build.sbt. You can also use Maven, but I recommend learning sbt as a way of learning Scala. Once you have resolved the dependency using sbt, your simple code should work fine. Still, I recommend doing a "hello world" first before a "word count" :-)
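For instance, a bare-bones sketch (my own example, no Spark involved) to verify that the Scala toolchain itself works:
```
object HelloWorld {
  // Plain Scala entry point: if this compiles and runs in your IDE,
  // the Scala setup is fine and any remaining problems are dependency issues
  def main(args: Array[String]): Unit =
    println("Hello, world!")
}
```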
And to answer your question, in your sbt build you should add the following library:
libraryDependencies += "org.apache.spark" % "spark-assembly_2.10" % "1.1.1"
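For context, here is a minimal build.sbt sketch built around that line (the project name and scalaVersion are placeholders I have assumed; the scalaVersion must line up with the _2.10 suffix of the artifact):
```
name := "spark-wordcount"  // hypothetical project name

scalaVersion := "2.10.4"   // must match the _2.10 artifact suffix

libraryDependencies += "org.apache.spark" % "spark-assembly_2.10" % "1.1.1"
```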
This is for spark-assembly 1.1.1; note that the _2.10 suffix refers to the Scala version, not Hadoop. Since you need a different version, you can find the proper coordinates by searching the Maven repository for the Spark/Hadoop artifact details.
Here's the pure Eclipse solution (I had to download and set up Eclipse just to answer this question):
```
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]): Unit = {
    // Assumption: setMaster("local[*]") lets this run directly inside Eclipse;
    // remove it if you submit the jar to a cluster with spark-submit
    val sparkConf = new SparkConf().setAppName("SampleTest").setMaster("local[*]")
    val spark = new SparkContext(sparkConf)
    val textFile = spark.textFile("hdfs://...")
    val counts = textFile.flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.saveAsTextFile("hdfs://...")
    spark.stop()
  }
}
```
Add the following dependency to your Maven pom.xml:
```
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
</dependency>
```
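If you want to sanity-check the Eclipse setup without an HDFS cluster, here is a minimal local sketch (my own example, not from the original answer; the in-memory sample data and the local[*] master are assumptions):
```
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._

object LocalSmokeTest {
  def main(args: Array[String]): Unit = {
    // Run entirely in-process so no cluster or HDFS is needed
    val sc = new SparkContext(
      new SparkConf().setAppName("LocalSmokeTest").setMaster("local[*]"))
    val counts = sc.parallelize(Seq("hello world", "hello spark"))
      .flatMap(_.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.collect().foreach(println)  // e.g. (hello,2), (world,1), (spark,1)
    sc.stop()
  }
}
```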
Hope this helps.
Make a build.sbt file with the following contents:
```
name := """sparktest"""

version := "1.0-SNAPSHOT"

scalaVersion := "2.11.7"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.0"
)
```
Configure the sbt Eclipse plugin. Create ~/.sbt/0.13/plugins/plugins.sbt with:
```
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "4.0.0")
```
Generate an Eclipse project with:
```
sbt eclipse
```
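Note that, as far as I know, the generated Eclipse project files are a snapshot of the build definition: if you later change dependencies in build.sbt, re-run sbt eclipse and refresh the project in Eclipse so the new jars are picked up.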