
sbt error: object spark is not a member of package org.apache

I installed sbt-1.3.4.msi and when trying to build a sample SparkPi.scala app, I'm getting the following error:

C:\myapps\sbt\sparksample>sbt
[info] Loading project definition from C:\myapps\sbt\sparksample\project
[info] Compiling 1 Scala source to C:\myapps\sbt\sparksample\project\target\scala-2.12\sbt-1.0\classes ...
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:3:19: object spark is not a member of package org.apache
[error] import org.apache.spark._
[error]                   ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:8:20: not found: type SparkConf
[error]     val conf = new SparkConf().setAppName("Spark Pi")
[error]                    ^
[error] C:\myapps\sbt\sparksample\project\src\main\scala\SparkPi.scala:9:21: not found: type SparkContext
[error]     val spark = new SparkContext(conf)
[error]                     ^
[error] three errors found
[error] (Compile / compileIncremental) Compilation failed

The SparkPi.scala file is in C:\myapps\sbt\sparksample\project\src\main\scala (as shown in the error messages above).

What am I missing here?

The C:\myapps\sbt\sparksample\sparksample.sbt file is as follows:

name := "Spark Sample"

version := "1.0"

scalaVersion := "2.12.10"

libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.0"
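For reference, the compile errors are consistent with the standard Spark Pi example. The question doesn't show the file itself, so the following is only a minimal sketch of what SparkPi.scala presumably contains (exact line numbers and body may differ):

// SparkPi.scala -- a sketch of the standard Spark Pi example, not the asker's exact file
import org.apache.spark._

object SparkPi {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("Spark Pi")
    val spark = new SparkContext(conf)

    // Estimate pi by sampling random points in the unit square
    val n = 100000
    val count = spark.parallelize(1 to n).map { _ =>
      val x = scala.util.Random.nextDouble() * 2 - 1
      val y = scala.util.Random.nextDouble() * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)

    println(s"Pi is roughly ${4.0 * count / n}")
    spark.stop()
  }
}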

The C:\myapps\sbt\sparksample\project\src\main\scala directory has the SparkPi.scala file.

That's the problem. You've got the Scala file(s) under the project directory, which is owned by sbt itself (it is part of sbt's meta-build, not your sbt-managed Scala project).

Move SparkPi.scala and any other Scala files to C:\myapps\sbt\sparksample\src\main\scala.
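After the move, the project should look roughly like this (a sketch based on the paths in the question; the build definition can be any *.sbt file at the project root):

C:\myapps\sbt\sparksample\
  sparksample.sbt            <- build definition
  project\                   <- sbt's own meta-build; no application sources here
  src\
    main\
      scala\
        SparkPi.scala        <- application sources go here

Re-running sbt from C:\myapps\sbt\sparksample should then compile the file against the spark-core dependency declared in sparksample.sbt.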

