Error in running Spark in IntelliJ: "object apache is not a member of package org"
I am running a Spark program in IntelliJ and getting the error: "object apache is not a member of package org".
I have used these import statements in the code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
The above import statements do not work at the sbt prompt either. The corresponding library appears to be missing, but I am not sure how to obtain it or where it should go.
Make sure you have entries like this in your build.sbt:
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0",
  "org.apache.spark" %% "spark-sql" % "2.1.0"
)
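Once those dependencies resolve, the imports from the question should compile. As a quick sanity check, here is a minimal sketch of a Spark application (the object name and app name are made up for illustration; the Spark version must match the one in your build.sbt):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkCheck {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark in-process, so no cluster is needed;
    // "SparkCheck" is an arbitrary application name.
    val conf = new SparkConf().setAppName("SparkCheck").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Trivial job: sum the integers 1..100 across partitions.
    val total = sc.parallelize(1 to 100).reduce(_ + _)
    println(s"sum = $total") // 5050

    sc.stop()
  }
}
```

If this file still shows "object apache is not a member of package org" in the editor, the sbt project has not actually been imported into IntelliJ, which is what the steps below address.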
Then make sure IntelliJ knows about these libraries, either by enabling "auto-import" or by clicking the refresh button on the sbt panel.
It is about 5 years since the previous answer, but I had the same issue and the answer mentioned here did not work. So, hopefully this answer works for those who find themselves in the same position I was in.
I was able to run my Scala program from the sbt shell, but it was not working in IntelliJ. This is what I did to fix the issue:
File -> Open -> select build.sbt -> choose the "project" option.
File -> Settings -> Plugins -> search for and install the sbt plugin.
View -> Tool Windows -> sbt, then click the refresh button in the sbt window.
The project should load successfully. Rebuild the project.
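If the IDE still misbehaves after these steps, it can help to confirm from the command line that the build itself is fine, so you know the problem is IntelliJ's import rather than your build.sbt (run these from the project root, assuming sbt is installed):

```shell
# Resolve dependencies and compile; this should succeed if build.sbt is correct
sbt compile

# Optionally run the application through sbt as well
sbt run
```

If `sbt compile` fails with the same error, the dependency entries are wrong; if it succeeds, re-import the project in IntelliJ.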