ExceptionInInitializerError Spark Streaming Kafka

I am trying to connect Spark Streaming to Kafka in a simple application. I created this application from the example in the Spark documentation. When I try to run it, I get this exception:

Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.streaming.dstream.InputDStream.<init>(InputDStream.scala:80)
    at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.<init>(DirectKafkaInputDStream.scala:59)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:147)
    at org.apache.spark.streaming.kafka010.KafkaUtils$.createDirectStream(KafkaUtils.scala:124)
    at producer.KafkaProducer$.main(KafkaProducer.scala:36)
    at producer.KafkaProducer.main(KafkaProducer.scala)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.4
    at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:751)
    at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)

Here is my code:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object KafkaProducer {

  def main(args: Array[String]): Unit = {

    val spark = SparkSession
      .builder()
      .appName("KafkaSparkStreaming")
      .master("local[*]")
      .getOrCreate()

    // 3-second batch interval
    val ssc = new StreamingContext(spark.sparkContext, Seconds(3))
    val topics = Array("topic1", "topic2")

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "1",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val lines = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](topics, kafkaParams)
    )
    // a DStream needs at least one output operation, so print the keys
    lines.map(_.key()).print()

    ssc.start()
    ssc.awaitTermination()
  }
}

I am not sure whether the problem is in the configuration or in the code itself. This is what my build.sbt file looks like:

scalaVersion := "2.11.4"

resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"

libraryDependencies ++= Seq(
  "org.apache.kafka" %% "kafka" % "1.1.0",
  "org.apache.spark" %% "spark-core" % "2.3.0",
  "org.apache.spark" %% "spark-sql" % "2.3.0",
  "org.apache.spark" %% "spark-streaming" % "2.3.0",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0"
)

I would appreciate any help, as I cannot figure out what the problem is!

Following the stack trace of the exception you are getting, we can see that the main problem is:

Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.4

Indeed:

Spark 2.1.0 contains com.fasterxml.jackson.core as a transitive dependency, so we do not need to include it in libraryDependencies. The same applies to Spark 2.3.0, which you are using.
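
If another dependency is still pulling a newer Jackson onto the classpath transitively (in this build, the kafka 1.1.0 artifact is a plausible source of the 2.9.4), one option is to force every Jackson module to the version Spark was built against. A minimal build.sbt sketch, assuming sbt 1.x and assuming 2.6.7 is the Jackson version Spark 2.3.0 expects (check Spark's own pom to confirm):

// Force all Jackson modules to the version Spark 2.3.0 was built against.
// 2.6.7 is an assumption here -- verify against Spark's pom before relying on it.
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core" % "jackson-core" % "2.6.7",
  "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.7"
)

Running sbt's evicted task afterwards shows which dependency was dragging in 2.9.4 and confirms that the override took effect.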

A similar problem and its solution are described in more detail here.
