
Spark - Saving JavaRDD to Cassandra

This link shows a way to save a JavaRDD to Cassandra like this:

import static com.datastax.spark.connector.CassandraJavaUtil.*;

JavaRDD<Product> productsRDD = sc.parallelize(products);
javaFunctions(productsRDD, Product.class).saveToCassandra("java_api", "products");

But the com.datastax.spark.connector.CassandraJavaUtil.* import seems deprecated. The updated API should be:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.*;

Can someone show me some code to store a JavaRDD to Cassandra using the updated API above?

According to the documentation, it should look like this:

javaFunctions(rdd).writerBuilder("ks", "people", mapToRow(Person.class)).saveToCassandra();
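For completeness, here is a minimal, self-contained sketch of that save call with the updated japi package, using the keyspace java_api and table products from the question. The Product bean's fields (id, name), the local master, and the connection host 127.0.0.1 are illustrative assumptions and must match your actual schema and cluster:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

import java.io.Serializable;
import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SaveToCassandraExample {

    // The bean must be Serializable, have a no-arg constructor and
    // getters/setters whose names match the Cassandra column names.
    public static class Product implements Serializable {
        private Integer id;
        private String name;

        public Product() { }
        public Product(Integer id, String name) { this.id = id; this.name = name; }

        public Integer getId() { return id; }
        public void setId(Integer id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("save-to-cassandra")
                .setMaster("local[2]")                                  // assumption: local test run
                .set("spark.cassandra.connection.host", "127.0.0.1");   // assumption: local Cassandra node
        JavaSparkContext sc = new JavaSparkContext(conf);

        List<Product> products = Arrays.asList(
                new Product(1, "keyboard"),
                new Product(2, "mouse"));
        JavaRDD<Product> productsRDD = sc.parallelize(products);

        // Updated API: build a writer from the bean's row mapping, then save.
        javaFunctions(productsRDD)
                .writerBuilder("java_api", "products", mapToRow(Product.class))
                .saveToCassandra();

        sc.stop();
    }
}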

replace

JavaRDD<Product> productsRDD = sc.parallelize(products);
javaFunctions(productsRDD, Product.class).saveToCassandra("java_api", "products");

by

JavaRDD<Product> productsRDD = javaFunctions(sc).cassandraTable("java_api", "products", mapRowTo(Product.class));
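Note that cassandraTable with mapRowTo is the read counterpart of writerBuilder with mapToRow: it maps Cassandra rows back onto the bean. As a rough sketch, reusing sc and the Product bean from the example above (same assumed keyspace and table), reading back what was written looks like this:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;

// Load the rows written above back into Product beans and count them.
JavaRDD<Product> loaded =
        javaFunctions(sc).cassandraTable("java_api", "products", mapRowTo(Product.class));
System.out.println("rows read: " + loaded.count());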
