
Streaming Spark Exception in thread "main"

I'm trying to stream Twitter data using Spark with Scala and sbt. Everything goes well, but I have a problem:

This is my build.sbt:

import Assembly._
import AssemblyPlugin._

name := "TwitterSparkStreaming"
version := "0.1"
scalaVersion := "2.12.3"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "1.5.2",
  "org.apache.spark" % "spark-sql_2.11" % "1.5.2",
  "org.apache.spark" % "spark-streaming_2.11" % "1.5.2",
  "org.apache.spark" % "spark-streaming-twitter_2.11" % "1.6.3",
  "joda-time" %% "joda-time" % "2.9.1",
  "org.twitter4j" % "twitter4j-core" % "3.0.3",
  "org.twitter4j" % "twitter4j-stream" % "3.0.3",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.5.2",
  "edu.stanford.nlp" % "stanford-corenlp" % "3.5.2" classifier "models"
)

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
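
(Note the mismatch in the file above: `scalaVersion` is 2.12.3, but every Spark artifact carries the `_2.11` suffix, and the artifacts themselves mix versions 1.5.2 and 1.6.3. For reference, a consistent configuration might look like the sketch below; it assumes Spark 2.2.0 built for Scala 2.11, and that the Twitter connector comes from Apache Bahir, where it moved after Spark 2.0.)

```scala
// Hypothetical consistent build.sbt: the Scala version matches the
// artifact suffixes, and %% derives that suffix automatically.
name := "TwitterSparkStreaming"
version := "0.1"
scalaVersion := "2.11.8"  // matches _2.11 Spark artifacts

val sparkVersion = "2.2.0"  // assumption: one Spark version everywhere

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"      % sparkVersion,
  "org.apache.spark" %% "spark-sql"       % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  // spark-streaming-twitter moved out of Spark core into Apache Bahir
  "org.apache.bahir" %% "spark-streaming-twitter" % sparkVersion
)
```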

This is the object that extends org.apache.spark.Logging:

import org.apache.log4j.{Logger, Level}
import org.apache.spark.Logging

object LogUtils extends Logging{
  def setStreamingLogLevels(): Unit ={
    val log4jInitialized = Logger.getRootLogger.getAllAppenders.hasMoreElements
    if(!log4jInitialized)
    {
      logInfo("Setting log level to [WARN] for streaming example." + " To override add a custom log4j.properties to the classpath.")
      Logger.getRootLogger.setLevel(Level.WARN)
    }
  }
}

This is the error that keeps appearing:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.Logging.$init$(Lorg/apache/spark/Logging;)V
    at LogUtils$.<init>(LogUtils.scala:4)
    at LogUtils$.<clinit>(LogUtils.scala)
    at TwitterStreaming$.main(TwitterStreaming.scala:30)
    at TwitterStreaming.main(TwitterStreaming.scala)

How can I fix it?

Note: I tried changing the org.apache.spark dependencies from version 2.2.0 to 1.5.2, but the problem remains the same.

The `NoSuchMethodError` usually points to a binary mismatch: your build sets `scalaVersion := "2.12.3"` while every dependency is a `_2.11` artifact, and `org.apache.spark.Logging` was made private to Spark in version 2.0, so application code can no longer extend it. In any case, there is a simpler way to set the log level in Spark.

See https://spark.apache.org/docs/latest/api/java/org/apache/spark/SparkContext.html#setLogLevel-java.lang.String-

Spark provides this method on the SparkContext itself, so you can just call:

sparkContext.setLogLevel("WARN")
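
For example, a minimal sketch (assuming Spark 2.x on the classpath; the app and object names are placeholders) that replaces the custom LogUtils object entirely:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical entry point: sets the log level via the SparkContext
// instead of extending the removed org.apache.spark.Logging trait.
object TwitterStreamingExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("TwitterSparkStreaming")
      .master("local[*]")
      .getOrCreate()

    // Valid levels: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
    spark.sparkContext.setLogLevel("WARN")

    // ... build the streaming context and process tweets here ...
  }
}
```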
