
Error in running Spark in IntelliJ: "object apache is not a member of package org"

I am running a Spark program in IntelliJ and getting the error below: "object apache is not a member of package org".

I have used these import statements in the code:

import org.apache.spark.SparkContext  
import org.apache.spark.SparkContext._  
import org.apache.spark.SparkConf

The above import statements do not work at the sbt prompt either. The corresponding library appears to be missing, but I am not sure how to add it or where it should go.

Make sure you have entries like this in your build.sbt:

scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0", 
  "org.apache.spark" %% "spark-sql" % "2.1.0" 
)

Then make sure IntelliJ knows about these libraries, either by enabling "auto-import" or by clicking the refresh-looking button on the SBT panel.
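Once sbt has resolved these dependencies and IntelliJ has picked them up, the imports from the question should compile. As a smoke test, a minimal sketch like the one below should build and print a small word count; the object name and the local[*] master URL are illustrative assumptions, not part of the original question:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SparkSmokeTest {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark in-process on all available cores -- an assumption
    // for local testing only, not something from the original question
    val conf = new SparkConf().setAppName("SparkSmokeTest").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // Tiny in-memory word count, just to prove spark-core is on the classpath
    val counts = sc.parallelize(Seq("spark", "sbt", "spark"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.collect().foreach(println)
    sc.stop()
  }
}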

It is about 5 years since the previous answer, but I had the same issue, and the answer mentioned here did not work for me. So, hopefully this answer works for those who find themselves in the same position I was in.

I was able to run my Scala program from the sbt shell, but it was not working in IntelliJ. This is what I did to fix the issue:

  1. Import the build.sbt file as a project.

File -> Open -> select build.sbt -> choose the "project" option.

  2. Install the sbt plugin and reload IntelliJ.

File -> settings -> Plugins -> search and install sbt.

  3. Run sbt.

Click "View" -> Tool Windows -> sbt.单击“查看”-> 工具窗口-> sbt。 Click on the refresh button in the SBT window.单击 SBT 窗口中的刷新按钮。

The project should load successfully. Rebuild the project.

  4. Select your file and click "Run". Ideally it should work now; if not, try the sanity check below.
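If step 4 still fails, a quick sanity check, assuming sbt is installed and on your PATH, is to compile and run the project from a terminal in the project root:

sbt compile
sbt run

If these commands succeed but IntelliJ still reports "object apache is not a member of package org", the problem is the IDE's project model rather than the build itself, and re-importing build.sbt (step 1) is usually what fixes it.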
