
error: value toDF is not a member of org.apache.spark.rdd.RDD[org.apache.kafka.clients.consumer.ConsumerRecord[String,String]]

I am trying to capture Kafka events (which I receive in serialized form) using Spark Streaming in Scala.

Here is my code snippet:

import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.InputDStream
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

val spark = SparkSession.builder().master("local[*]").appName("Spark-Kafka-Integration").getOrCreate()
spark.conf.set("spark.driver.allowMultipleContexts", "true")

val sc = spark.sparkContext
val ssc = new StreamingContext(sc, Seconds(5))

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

val topics=Set("<topic-name>")
val brokers="<some-list>"
val groupId="spark-streaming-test"

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> brokers,
  "auto.offset.reset" -> "earliest",
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> "org.apache.kafka.common.serialization.StringDeserializer",
  "group.id" -> groupId,
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val messages: InputDStream[ConsumerRecord[String, String]] =
  KafkaUtils.createDirectStream[String, String](
    ssc,
    LocationStrategies.PreferConsistent,
    ConsumerStrategies.Subscribe[String, String](topics, kafkaParams)
  )

messages.foreachRDD { rdd =>
  println(rdd.toDF())   // this line fails to compile (see the error below)
}

ssc.start()
ssc.awaitTermination()

I get the following error message:

Error:(59, 19) value toDF is not a member of org.apache.spark.rdd.RDD[org.apache.kafka.clients.consumer.ConsumerRecord[String,String]]
    rdd.toDF())

toDF comes via DatasetHolder:

https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.SQLImplicits

I haven't reproduced it, but my guess is that there is no Encoder for ConsumerRecord[String, String], so you can either provide one yourself or first map each record to something an Encoder can be derived for (a case class or primitives).
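A minimal sketch of that second option, assuming the import sqlContext.implicits._ from the question is still in scope; the column names "key" and "value" are illustrative:

messages.foreachRDD { rdd =>
  // Extract plain String fields; Spark can derive an Encoder for (String, String),
  // unlike for ConsumerRecord[String, String] itself.
  val df = rdd.map(record => (record.key, record.value)).toDF("key", "value")
  df.show()
}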

Also, println inside foreachRDD may not behave the way you want, due to Spark's distributed nature.
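For what it's worth, println(rdd.toDF()) would in any case only print the DataFrame's schema string rather than its contents; df.show(), as in the sketch above, prints actual rows to the driver's stdout.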

