Spark and Cassandra: requirement failed: Columns not found in class com.datastax.spark.connector.japi.CassandraRow: [mycolumn...]

I have a CassandraRow object that contains the values of a row. I read it from one table, and I want to write that same object to another table. But I get this error:

requirement failed: Columns not found in class com.datastax.spark.connector.japi.CassandraRow: [myColumn1, myColumns2, ...]

I tried to pass my own mapping by creating a Map and passing it to the function. This is my code:

import java.util.List;
import java.util.Map;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import com.datastax.spark.connector.japi.CassandraJavaUtil;
import com.datastax.spark.connector.japi.CassandraRow;

CassandraRow row = fetch(); // reads one row from the source table

Map<String, String> mapping = Map.of("myColumn1", "myColumn1", "myColumns2", "myColumns2"....);

JavaSparkContext ctx = new JavaSparkContext(conf);

JavaRDD<CassandraRow> insightRDD = ctx.parallelize(List.of(row));

CassandraJavaUtil.javaFunctions(insightRDD).writerBuilder("mykeyspace", "mytable",
            CassandraJavaUtil.mapToRow(CassandraRow.class, mapping)).saveToCassandra(); // I also tried without the mapping

Any help is appreciated. I have tried the POJO approach and it works, but I don't want to be restricted to creating POJOs. I want a generic approach that would work with any table and any row.
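For context, here is a minimal sketch of the POJO approach that does work for me; the MyRow bean, its name/score properties, and the target table are just placeholders, not my real schema:

import java.io.Serializable;
import java.util.List;

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import com.datastax.spark.connector.japi.CassandraJavaUtil;

// Placeholder bean; its properties must line up with the target table's columns (here: name, score).
public class MyRow implements Serializable {
    private String name;
    private int score;

    public MyRow() {}
    public MyRow(String name, int score) { this.name = name; this.score = score; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getScore() { return score; }
    public void setScore(int score) { this.score = score; }
}

// Writing works here because mapToRow can map the bean's getters to columns:
JavaRDD<MyRow> rdd = ctx.parallelize(List.of(new MyRow("a", 1)));
CassandraJavaUtil.javaFunctions(rdd)
        .writerBuilder("mykeyspace", "mytable", CassandraJavaUtil.mapToRow(MyRow.class))
        .saveToCassandra();

The drawback is that every table needs its own bean class, which is exactly the restriction I want to avoid.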

I could not find a way to generalize my solution using Apache Spark, so I used the DataStax Java Driver for Apache Cassandra and wrote the CQL queries myself. That was generic enough for me.
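Roughly, the idea looks like the sketch below (driver 4.x; mykeyspace, source_table, target_table and the WHERE clause are placeholders, and it assumes both tables share the same column set):

import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

import com.datastax.oss.driver.api.core.CqlSession;
import com.datastax.oss.driver.api.core.cql.BoundStatement;
import com.datastax.oss.driver.api.core.cql.ColumnDefinition;
import com.datastax.oss.driver.api.core.cql.PreparedStatement;
import com.datastax.oss.driver.api.core.cql.Row;

public class CopyRowExample {
    public static void main(String[] args) {
        try (CqlSession session = CqlSession.builder().withKeyspace("mykeyspace").build()) {
            // Fetch the source row; the column set comes from the row itself, not from a POJO.
            Row source = session.execute("SELECT * FROM source_table WHERE id = 1").one();

            List<String> names = new ArrayList<>();
            List<Object> values = new ArrayList<>();
            int i = 0;
            for (ColumnDefinition def : source.getColumnDefinitions()) {
                names.add(def.getName().asCql(true));  // column name as it appears in CQL
                values.add(source.getObject(i++));     // value in the driver's default Java type
            }

            // Build an INSERT that matches whatever columns the fetched row has.
            String cql = "INSERT INTO target_table (" + String.join(", ", names) + ") VALUES ("
                    + names.stream().map(n -> "?").collect(Collectors.joining(", ")) + ")";
            PreparedStatement ps = session.prepare(cql);
            BoundStatement bound = ps.bind(values.toArray());
            session.execute(bound);
        }
    }
}

Because the INSERT is built from the row's own column definitions, the same code copies rows between any pair of tables with matching columns, without a per-table POJO.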
