

Parsing JSON in Spark Streaming

I'm pretty new to Spark and I'm trying to receive a DStream of JSON messages from a Kafka topic, and I want to parse the content of each JSON message. The JSON I receive looks something like this:

{"type":"position","ident":"IBE32JZ","air_ground":"A","alt":"34000","clock":"1409733420","id":"IBE32JZ-1409715361-ed-0002:0","gs":"446","heading":"71","lat":"44.50987","lon":"2.98972","reg":"ECJRE","squawk":"1004","updateType":"A","altChange":" "}

I'm trying to extract only the ident field, at least for now, and I'm using the lift-json library to parse the data. My program looks like this:

import net.liftweb.json._
import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object ScalaExample {
    val kafkaHost = "localhost"
    val kafkaPort = 9092
    val zookeeperHost = "localhost"
    val zookeeperPort = 2181

    implicit val formats = DefaultFormats
    case class PlaneInfo(ident: String)

    def parser(json: String): String = {
        val parsedJson = parse(json)
        val m = parsedJson.extract[PlaneInfo]
        m.ident
    }

    def main(args: Array[String]) {
        val zkQuorum = "localhost:2181"
        val group = "myGroup"
        val topic = Map("flightStatus" -> 1)
        val sparkContext = new SparkContext("local[4]", "KafkaConsumer")
        val ssc = new StreamingContext(sparkContext, Seconds(10))

        // the Kafka receiver yields (key, message) pairs; the payload is the second element
        val json = KafkaUtils.createStream(ssc, zkQuorum, group, topic)

        val id = json.map(_._2).map(parser)

        id.print()

        ssc.start()
        ssc.awaitTermination()
    }
}

but it throws the following exception:

java.lang.NoClassDefFoundError: scala/reflect/ClassManifest
    at net.liftweb.json.JsonAST$JValue.extract(JsonAST.scala:300)
    at aero.catec.stratio.ScalaExample$.parser(ScalaExample.scala:33)
    at aero.catec.stratio.ScalaExample$$anonfun$2.apply(ScalaExample.scala:48)
    at aero.catec.stratio.ScalaExample$$anonfun$2.apply(ScalaExample.scala:48)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at scala.collection.Iterator$$anon$10.next(Iterator.scala:312)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
    at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
    at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
    at scala.collection.AbstractIterator.to(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
    at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
    at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
    at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
    at org.apache.spark.rdd.RDD$$anonfun$28.apply(RDD.scala:1003)
    at org.apache.spark.rdd.RDD$$anonfun$28.apply(RDD.scala:1003)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1083)
    at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1083)
    at org.apache.spark.scheduler.DAGScheduler.runLocallyWithinThread(DAGScheduler.scala:575)
    at org.apache.spark.scheduler.DAGScheduler$$anon$1.run(DAGScheduler.scala:560)
Caused by: java.lang.ClassNotFoundException: scala.reflect.ClassManifest
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

The thing is that if I run the same code without Spark (reading from a file), it works perfectly. The problem starts when I try to put it into a Spark program. Also, if I change the parser function to something like this:

def parser(json: String): JValue = {
  val parsedJson = parse(json)
  parsedJson \\ "ident"
}

That also works. But when I try to extract the actual String, I get the same error.
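As an aside, I imagine the raw value could be read without extract by pattern matching on the lift-json AST, something like this untested sketch, but I'd still like to understand why extract fails:

import net.liftweb.json._
import net.liftweb.json.JsonAST._

// Sketch: read "ident" straight off the parsed AST,
// avoiding the reflection-based extract[...] call.
def identOf(json: String): Option[String] =
  parse(json) \ "ident" match {
    case JString(s) => Some(s) // field present and is a string
    case _          => None    // field missing or not a string
  }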

Thank you for your help. I hope I have explained it well.

This happens because you are missing the scala-reflect dependency needed to serialize/deserialize the record. Try adding the scala-reflect jar that matches your Spark/Scala version.

Tip: "org.scala-lang" % "scala-reflect" % Version.scala提示: "org.scala-lang" % "scala-reflect" % Version.scala

Oh, a good old issue :-)

Basically this indicates a version problem: one of your dependencies is not compatible with the Scala compiler you are currently using. Are you on 2.10?

Try Googling the phrase "NoClassDefFoundError: scala/reflect/ClassManifest"; I'm sure you will find more than enough discussion of the issue.
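If that is what's going on, the usual fix is to make sure every Scala dependency is built for the same Scala major version as your compiler, e.g. by letting sbt pick the matching artifact (a sketch; the lift-json version is only an example):

// Sketch: the usual culprit is a dependency compiled against an older Scala
// (2.9.x still shipped the scala.reflect.ClassManifest class) running on a newer one.

// Suspicious: artifact hard-wired to a Scala version that differs from scalaVersion
// libraryDependencies += "net.liftweb" % "lift-json_2.9.2" % "2.5.1"

// Safer: %% appends the suffix that matches your scalaVersion automatically
libraryDependencies += "net.liftweb" %% "lift-json" % "2.6"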
