
Spark job fails while filtering kafka messages

I need to validate the event messages sent to Kafka by checking that each message has the required fields, and if it does, push the data to Elasticsearch. This is how I do it:

// Imports needed for this snippet (json4s is an assumption for parse;
// the rest follow from spark-streaming-kafka-0-10 and elasticsearch-spark):
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.dstream.InputDStream
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferBrokers
import org.elasticsearch.spark.streaming.EsSparkStreaming
import org.json4s.jackson.JsonMethods.parse

object App {

  // Parse raw JSON strings, normalize single events and event lists to a
  // common shape, then validate that required fields are present.
  val parseJsonStream = (inStream: RDD[String]) => {
    inStream.flatMap(json => {
      try {
        val parsed = parse(json)
        Option(parsed)
      } catch {
        case e: Exception =>
          System.err.println("Exception while parsing JSON: " + json)
          e.printStackTrace()
          None
      }
    }).flatMap(v => {
      if (v.values.isInstanceOf[List[Map[String, Map[String, Any]]]])
        v.values.asInstanceOf[List[Map[String, Map[String, Any]]]]
      else if (v.values.isInstanceOf[Map[String, Map[String, Any]]])
        List(v.values.asInstanceOf[Map[String, Map[String, Any]]])
      else {
        System.err.println("EVENT WRONG FORMAT: " + v.values)
        List()
      }
    }).flatMap(mapa => {
      val h = mapa.get("header")
      val b = mapa.get("body")
      if (h.toSeq.toString.contains("session.end") && !b.toSeq.toString.contains("duration")) {
        System.err.println("session.end HAS NO DURATION FIELD!")
        None
      }
      else if (h.isEmpty || h.get.get("userID").isEmpty || h.get.get("timestamp").isEmpty) {
        throw new Exception("FIELD IS MISSING")
        None
      }
      else {
        Some(mapa)
      }
    })
  }

  val kafkaStream: InputDStream[ConsumerRecord[String, String]] = KafkaUtils.createDirectStream[String, String](
    ssc, PreferBrokers, Subscribe[String, String](KAFKA_EVENT_TOPICS, kafkaParams)
  )
  val kafkaStreamParsed = kafkaStream.transform(rdd => {
    val eventJSON = rdd.map(_.value)
    parseJsonStream(eventJSON)
  })

  val esEventsStream = kafkaStreamParsed.map(addElasticMetadata(_))

  try {
    EsSparkStreaming.saveToEs(
      esEventsStream,
      ELASTICSEARCH_EVENTS_INDEX + "_{postfix}" + "/" + ELASTICSEARCH_TYPE,
      Map("es.mapping.id" -> "docid")
    )
  } catch {
    case e: Exception =>
      EsSparkStreaming.saveToEs(esEventsStream, ELASTICSEARCH_FAILED_EVENTS)
      e.printStackTrace()
  }
}

I guess someone is sending invalid events (that is why I added these checks in the first place), but the Spark job does not skip the bad message; it fails with:

User class threw exception: org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 6.0 failed 4 times, most recent failure: Lost task 2.3 in stage 6.0 (TID 190, xxx.xxx.host.xx, executor 3): java.lang.Exception: FIELD IS MISSING

How can I prevent it from crashing and make it just skip the message instead? It is a YARN application, using:

Spark 2.3.1
Spark-streaming-kafka-0-10_2.11:2.3.1
Scala 2.11.8

Instead of this:

throw new Exception("FIELD IS MISSING")
None

just do this:

None

Throwing that exception is what terminates your program. The validation runs inside an RDD transformation, i.e. on the executors, so the exception fails the task; Spark then retries the task (four times by default, which matches the "failed 4 times" in your error) and finally aborts the whole job. Returning None from flatMap simply drops the invalid record and lets the job continue.
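
Applied to the last flatMap in your parseJsonStream, the fix could look like the sketch below. The helper name validateEvent and the extra System.err.println message are illustrative additions, not from your original code:

// Sketch: same validation logic, but invalid events are logged and skipped
// instead of throwing, so a single bad record cannot abort the job.
// (validateEvent and the log message are assumptions for illustration.)
def validateEvent(mapa: Map[String, Map[String, Any]]): Option[Map[String, Map[String, Any]]] = {
  val h = mapa.get("header")
  val b = mapa.get("body")
  if (h.toSeq.toString.contains("session.end") && !b.toSeq.toString.contains("duration")) {
    System.err.println("session.end HAS NO DURATION FIELD!")
    None
  } else if (h.isEmpty || h.get.get("userID").isEmpty || h.get.get("timestamp").isEmpty) {
    System.err.println("FIELD IS MISSING: " + mapa) // log and skip, do not throw
    None
  } else {
    Some(mapa)
  }
}

The last step of parseJsonStream then becomes .flatMap(validateEvent), and malformed events end up in the executor logs instead of killing the stage.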
