
object StreamingContext is not a member of package org.apache.spark [error] import org.apache.spark.StreamingContext

I am trying to run an nc word-count program in Spark using SBT, and I am getting the error below in the log. My Spark version is 1.6.3 and my Scala version is 2.10.0.

  [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
[warn]  * commons-net:commons-net:2.2 is selected over 3.1
[warn]      +- org.apache.spark:spark-core_2.10:1.6.3             (depends on 2.2)
[warn]      +- org.apache.hadoop:hadoop-common:2.2.0              (depends on 3.1)
[warn]  * com.google.guava:guava:14.0.1 is selected over 11.0.2
[warn]      +- org.apache.curator:curator-recipes:2.4.0           (depends on 14.0.1)
[warn]      +- org.tachyonproject:tachyon-client:0.8.2            (depends on 14.0.1)
[warn]      +- org.apache.curator:curator-client:2.4.0            (depends on 14.0.1)
[warn]      +- org.tachyonproject:tachyon-underfs-hdfs:0.8.2      (depends on 14.0.1)
[warn]      +- org.apache.curator:curator-framework:2.4.0         (depends on 14.0.1)
[warn]      +- org.tachyonproject:tachyon-underfs-s3:0.8.2        (depends on 14.0.1)
[warn]      +- org.tachyonproject:tachyon-underfs-local:0.8.2     (depends on 14.0.1)
[warn]      +- org.apache.hadoop:hadoop-hdfs:2.2.0                (depends on 11.0.2)
[warn]      +- org.apache.hadoop:hadoop-common:2.2.0              (depends on 11.0.2)
[warn]  * com.google.code.findbugs:jsr305:1.3.9 is selected over 2.0.1
[warn]      +- com.google.guava:guava:11.0.2                      (depends on 1.3.9)
[warn]      +- org.apache.spark:spark-core_2.10:1.6.3             (depends on 1.3.9)
[warn]      +- org.apache.spark:spark-unsafe_2.10:1.6.3           (depends on 1.3.9)
[warn]      +- org.apache.spark:spark-network-common_2.10:1.6.3   (depends on 1.3.9)
[warn]      +- com.fasterxml.jackson.module:jackson-module-scala_2.10:2.4.4 (depends on 2.0.1)
[warn] Run 'evicted' to see detailed eviction warnings
[info] Compiling 1 Scala source to /home/training/Desktop/SBT/sbt/bin/sparknc/target/scala-2.10/classes ...
[error] /home/training/Desktop/SBT/sbt/bin/sparknc/src/main/scala/sparkstreaming.scala:2:8: object StreamingContext is not a member of package org.apache.spark
[error] import org.apache.spark.StreamingContext
[error]        ^
[error] /home/training/Desktop/SBT/sbt/bin/sparknc/src/main/scala/sparkstreaming.scala:6:56: value setApplication is not a member of org.apache.spark.SparkConf
[error] val mysparkconf= new SparkConf().setMaster("local[2]").setApplication("My networking application")
[error]                                                        ^
[error] /home/training/Desktop/SBT/sbt/bin/sparknc/src/main/scala/sparkstreaming.scala:7:27: not found: type StreamingContext
[error] val streamingcontext= new StreamingContext(mysparkconf, seconds(2))
[error]                           ^

[error] three errors found
[error] (Compile / compileIncremental) Compilation failed

You need to add the exact spark-streaming dependency to your build.sbt file.

Set the Scala version to 2.10.5:

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.6.3" % "provided"
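
For context, a minimal build.sbt along these lines should work (the project name and version below are placeholders; the "provided" scope assumes the job is launched via spark-submit, so drop it if you run the program directly with sbt run):

name := "sparknc"

version := "0.1"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  // spark-core already appears in the question's dependency log;
  // spark-streaming is the missing piece that provides StreamingContext
  "org.apache.spark" %% "spark-core"      % "1.6.3" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.6.3" % "provided"
)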

Just a suggestion: use the latest Spark release (2.3.1); in newer versions most of these dependency issues have been resolved.

Here is a link to the fixed issues: spark docs
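
Note that the dependency alone does not fix all three errors. The import path must be org.apache.spark.streaming.StreamingContext (there is no StreamingContext directly under org.apache.spark), the SparkConf method is setAppName, not setApplication, and the batch-interval helper is Seconds (capital S) from org.apache.spark.streaming. A minimal sketch of a corrected sparkstreaming.scala (the object name and the localhost:9999 socket are placeholders):

import org.apache.spark.SparkConf
// StreamingContext and Seconds live in org.apache.spark.streaming,
// not directly in org.apache.spark
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SparkNC {
  def main(args: Array[String]): Unit = {
    // The SparkConf method is setAppName, not setApplication
    val mysparkconf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("My networking application")

    // The batch-interval helper is Seconds (capitalized), imported above
    val streamingcontext = new StreamingContext(mysparkconf, Seconds(2))

    // Classic word count over lines read from the nc socket
    val lines = streamingcontext.socketTextStream("localhost", 9999)
    val counts = lines.flatMap(_.split(" "))
                      .map(word => (word, 1))
                      .reduceByKey(_ + _)
    counts.print()

    streamingcontext.start()
    streamingcontext.awaitTermination()
  }
}

Before running the job, start netcat in another terminal (nc -lk 9999) so the socket stream has something to connect to.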
