Spark and Cassandra: requirement failed: Columns not found in class com.datastax.spark.connector.japi.CassandraRow: [mycolumn...]

I have a CassandraRow object that contains the values of a row I read from one table. I want to write that same object to another table, but I get this error:

requirement failed: Columns not found in class com.datastax.spark.connector.japi.CassandraRow: [myColumn1, myColumns2, ...]

I tried to supply my own column mapping by creating a Map and passing it to the writer builder. This is my code:

import java.util.List;
import java.util.Map;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import com.datastax.spark.connector.japi.CassandraJavaUtil;
import com.datastax.spark.connector.japi.CassandraRow;

CassandraRow row = fetch(); // reads one row from the source table

// explicit property-name -> column-name mapping
Map<String, String> mapping = Map.of("myColumn1", "myColumn1", "myColumns2", "myColumns2" /* ... remaining columns ... */);

JavaSparkContext ctx = new JavaSparkContext(conf);

JavaRDD<CassandraRow> insightRDD = ctx.parallelize(List.of(row));

CassandraJavaUtil.javaFunctions(insightRDD)
        .writerBuilder("mykeyspace", "mytable",
                CassandraJavaUtil.mapToRow(CassandraRow.class, mapping))
        .saveToCassandra(); // I also tried without the mapping

Any help is appreciated. I have tried the POJO approach and it works, but I don't want to be restricted to creating POJOs. I want a generic approach that works with any table and any row.
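For reference, the working POJO approach looks roughly like the sketch below. This is only an illustration: MyRow, its fields, and the source table name are hypothetical, and the field names have to line up with the target table's columns according to the connector's mapping rules.

// Hypothetical JavaBean whose properties correspond to the table's columns.
public class MyRow implements java.io.Serializable {
    private String myColumn1;
    private String myColumns2;

    public String getMyColumn1() { return myColumn1; }
    public void setMyColumn1(String v) { this.myColumn1 = v; }
    public String getMyColumns2() { return myColumns2; }
    public void setMyColumns2(String v) { this.myColumns2 = v; }
}

// Read the source table as MyRow objects and write them to the target table.
JavaRDD<MyRow> rows = CassandraJavaUtil.javaFunctions(ctx)
        .cassandraTable("mykeyspace", "sourcetable", CassandraJavaUtil.mapRowTo(MyRow.class));

CassandraJavaUtil.javaFunctions(rows)
        .writerBuilder("mykeyspace", "mytable", CassandraJavaUtil.mapToRow(MyRow.class))
        .saveToCassandra();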

I could not find a way to generalize my solution using Apache Spark, so I used the DataStax Java Driver for Apache Cassandra and wrote CQL queries instead. That was generic enough for me.
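The answer above does not include code, so the following is only a minimal sketch of how such a generic copy could look with the DataStax Java Driver (4.x). The keyspace and table names (mykeyspace.sourcetable, mykeyspace.mytable) are hypothetical, and it assumes both tables have the same columns. The column names and values come from each row's metadata, so no POJO is needed.

import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.ColumnDefinition;
import com.datastax.oss.driver.api.core.cql.Row;
import com.datastax.oss.driver.api.core.cql.SimpleStatement;

public class CopyRows {
    public static void main(String[] args) {
        try (CqlSession session = CqlSession.builder().build()) {
            // Read every row from the source table.
            for (Row row : session.execute("SELECT * FROM mykeyspace.sourcetable")) {
                List<String> columns = new ArrayList<>();
                List<Object> values = new ArrayList<>();
                int i = 0;
                for (ColumnDefinition col : row.getColumnDefinitions()) {
                    columns.add(col.getName().asCql(true)); // quote the name only if needed
                    values.add(row.getObject(i++));          // generic value, no POJO mapping
                }
                // Build an INSERT with one bind marker per column and execute it.
                String cql = "INSERT INTO mykeyspace.mytable ("
                        + String.join(", ", columns) + ") VALUES ("
                        + columns.stream().map(c -> "?").collect(Collectors.joining(", ")) + ")";
                session.execute(SimpleStatement.newInstance(cql, values.toArray()));
            }
        }
    }
}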
