
spark kafkastream to cassandra in scala

I am trying to insert Kafka streaming JSON data into my Cassandra using Scala, but unfortunately I am stuck. My code is:

val kafkaStream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)
val records = kafkaStream.map(_._2)
val collection = records.flatMap(_.split(",")).map(s => event(s(0).toString, s(1).toString))
case class event(vehicleid: String, vehicletype: String)
collection.foreachRDD(x => println(x))
collection.saveToCassandra("traffickeyspace", "test", SomeColumns("vehicleid", "vehicletype"))

The error I am getting is:

not enough arguments for method saveToCassandra: (implicit connector: com.datastax.spark.connector.cql.CassandraConnector, implicit rwf: com.datastax.spark.connector.writer.RowWriterFactory[event])Unit. Unspecified value parameter rwf. kafkatesting.scala  /SparkRedis/src/com/spark/test  line 48 Scala Problem 

The other error is:

could not find implicit value for parameter rwf: com.datastax.spark.connector.writer.RowWriterFactory[event]    kafkatesting.scala  /SparkRedis/src/com/spark/test  line 48 Scala Problem

My JSON record from the producer is:

{"vehicleId":"3a92516d-58a7-478e-9cff-baafd98764a3","vehicleType":"Small Truck","routeId":"Route-37","longitude":"-95.30818","latitude":"33.265877","timestamp":"2018-03-28 06:21:47","speed":58.0,"fuelLevel":25.0}

You actually cannot declare your case class where you have it. Case classes have to be defined at the top-level scope to get the TypeTag they need. See here for more details: Scala - No TypeTag available exception when using case class to try to get TypeTag?

So move your case class to the top-level scope of your file. That way it gets its TypeTag, which allows it to get its ColumnMapper, which in turn lets it pick up its implicit RowWriterFactory.
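
For illustration only, a minimal sketch of the restructured file might look like this (the object name, method signature, and the simplified field extraction are assumptions added here, not part of the original post; the essential change is that the case class now sits at the top level of the file):

import com.datastax.spark.connector.SomeColumns
import com.datastax.spark.connector.streaming._ // adds saveToCassandra to DStreams
import kafka.serializer.StringDecoder
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.kafka.KafkaUtils

// Case class at top-level scope: the compiler can now produce its TypeTag,
// from which the connector derives a ColumnMapper and the implicit
// RowWriterFactory[event] that saveToCassandra requires.
case class event(vehicleid: String, vehicletype: String)

object KafkaToCassandra {
  // Hypothetical wiring; ssc, kafkaParams and topics come from your existing setup.
  def process(ssc: StreamingContext,
              kafkaParams: Map[String, String],
              topics: Set[String]): Unit = {
    val kafkaStream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)
    val records = kafkaStream.map(_._2)

    // Field extraction kept deliberately simple, as in the question; a real
    // JSON payload would be parsed with a JSON library rather than split(",").
    val collection = records.map { line =>
      val fields = line.split(",")
      event(fields(0), fields(1))
    }

    collection.saveToCassandra("traffickeyspace", "test",
      SomeColumns("vehicleid", "vehicletype"))
  }
}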
