How to store a Scala object in a Cassandra table using Spark
I have a Scala model class whose objects I want to store in a Cassandra table. There is a mismatch between the order of the Cassandra column names and the declaration order of the Scala class variables. The Cassandra table also has an additional column (tr_tag text) that does not exist among the Scala class variables. I am not able to make the write work: data is not getting inserted. Please help me to resolve this.
Model Scala class:
class THData() extends Serializable {
  var s_id: java.lang.Long = null
  var a_id: String = null
  var s_typ: String = null
  var s_dt: java.util.Date = null
  var t_s_id: String = null
  var a_s_no: String = null
  var avg_sp: java.lang.Float = null
}
Method to insert the object into Cassandra:
def insert(data: THData) {
  val em = sc.parallelize(Seq(data))
  em.saveToCassandra("ap", "t_s_data")
}
The Cassandra table has the following column names:
sid, aid, styp, sdt, tsid, asno, avgsp, tr_tag
I think you should map the column names as described in the documentation:
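A minimal sketch of that mapping, assuming the spark-cassandra-connector's `SomeColumns` with `as` aliasing (`"cassandraColumn" as "scalaProperty"`). Listing the columns explicitly also means the extra `tr_tag` column is simply omitted from the write, so no matching field is required in `THData`:

```scala
import com.datastax.spark.connector._

def insert(data: THData): Unit = {
  val em = sc.parallelize(Seq(data))
  // Map each Cassandra column to the corresponding THData property.
  // Columns not listed here (e.g. tr_tag) are not written at all.
  em.saveToCassandra("ap", "t_s_data",
    SomeColumns(
      "sid"   as "s_id",
      "aid"   as "a_id",
      "styp"  as "s_typ",
      "sdt"   as "s_dt",
      "tsid"  as "t_s_id",
      "asno"  as "a_s_no",
      "avgsp" as "avg_sp"))
}
```

Note that with an explicit `SomeColumns` list the declaration order of the Scala variables no longer matters, since each property is matched to its column by name.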