
saveToCassandra could not find implicit value for parameter rwf

I'm trying to save a dataset to a Cassandra database using Spark Scala, but I am getting an exception while running the code. Link used: http://rustyrazorblade.com/2015/01/introduction-to-spark-cassandra/

error:

could not find implicit value for parameter rwf: com.datastax.spark.connector.writer.RowWriterFactory[FoodToUserIndex]
 food_index.saveToCassandra("tutorial", "food_to_user_index")
                          ^

.scala

// imports needed by this snippet (not shown in the original post)
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

def main(args: Array[String]): Unit = {

val conf = new SparkConf(true)
  .set("spark.cassandra.connection.host", "localhost")
  .set("spark.executor.memory", "1g")
  .set("spark.cassandra.connection.native.port", "9042")
val sc = new SparkContext(conf)


case class FoodToUserIndex(food: String, user: String)

val user_table = sc.cassandraTable[CassandraRow]("tutorial", "user").select("favorite_food", "name")

val food_index = user_table.map(r => new FoodToUserIndex(r.getString("favorite_food"), r.getString("name")))
food_index.saveToCassandra("tutorial", "food_to_user_index")
}

build.sbt

name := "intro_to_spark"

version := "1.0"

scalaVersion := "2.11.2"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.0-rc3"

If I change the Scala version to 2.10 and the Cassandra connector to 1.1.0, it works. But I need to use Scala 2.11:

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()

Moving case class FoodToUserIndex(food: String, user: String) outside the main function should solve the problem.
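The mechanism behind that fix can be sketched in plain Scala. This is a toy illustration of the implicit type-class lookup that saveToCassandra relies on, not the real connector API: RowWriter, save, and SaveSketch below are made-up stand-ins for RowWriterFactory and saveToCassandra. The point is that FoodToUserIndex sits at top level, where an implicit instance for it can be defined and found by the compiler:

```scala
// Toy stand-in for the connector's RowWriterFactory type class (hypothetical).
trait RowWriter[T] { def write(t: T): Map[String, String] }

// The case class lives at top level, outside any method, so an implicit
// instance for it can be resolved at the saveToCassandra call site.
case class FoodToUserIndex(food: String, user: String)

object RowWriter {
  // Hand-written instance; the real connector derives one automatically.
  implicit val foodToUserWriter: RowWriter[FoodToUserIndex] =
    new RowWriter[FoodToUserIndex] {
      def write(t: FoodToUserIndex) = Map("food" -> t.food, "user" -> t.user)
    }
}

object SaveSketch {
  // Mirrors saveToCassandra's implicit parameter `rwf`: if no implicit
  // RowWriter[T] is in scope, this fails to compile with the same kind
  // of "could not find implicit value" error as in the question.
  def save[T](rows: Seq[T])(implicit rwf: RowWriter[T]): Seq[Map[String, String]] =
    rows.map(rwf.write)

  def main(args: Array[String]): Unit = {
    println(save(Seq(FoodToUserIndex("pizza", "alice"))))
  }
}
```

A case class declared inside main is a local class, and implicit resolution cannot supply an instance for it the same way, which is why hoisting it to top level makes the error go away.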

It has to do with the "datastax spark-cassandra-connector" version, not the Scala version.

So far, version 1.2.x is missing support for saving from a custom class.

Try "datastax spark-cassandra-connector" version 1.1.1 with Scala 2.11 and it should work.

Note: make sure Spark is compiled against Scala 2.11 too.
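Putting this answer together with the question's build file, the change would look something like the following build.sbt (the connector version 1.1.1 comes from this answer; the other lines are carried over unchanged from the question):

```scala
name := "intro_to_spark"

version := "1.0"

// keep Scala 2.11, as required
scalaVersion := "2.11.2"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"

// 1.1.x line instead of 1.2.0-rc3, per this answer
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.1.1"
```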
