
How to store a Scala object in a Cassandra table using Spark

I have a Scala model class whose objects I want to store in a Cassandra table. The Cassandra column names do not match the names or declaration order of the Scala class fields.

The Cassandra table also has an additional column (tr_tag text) that has no counterpart among the Scala class fields, and I am not able to handle it.

The data is not getting inserted. Please help me resolve this.

Model scala class:

class THData() extends Serializable {
  var s_id: java.lang.Long = null
  var a_id: String = null
  var s_typ: String = null
  var s_dt: java.util.Date = null
  var t_s_id: String = null
  var a_s_no: String = null
  var avg_sp: java.lang.Float = null
}

Method to insert an object into Cassandra:

def insert(data: THData): Unit = {
  val em = sc.parallelize(Seq(data))
  em.saveToCassandra("ap", "t_s_data")
}

The Cassandra table has the following column names:

sid,aid,styp,sdt,tsid,asno,avgsp,tr_tag

I think you should map the column names to your class fields as described in the connector documentation:

[image: excerpt from the connector documentation]
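For example, here is a minimal sketch of that mapping, assuming the DataStax spark-cassandra-connector: `SomeColumns` lets you list exactly the table columns to write, and the `as` alias binds each Cassandra column to the differently named property of `THData`. Because `tr_tag` is not listed, the connector simply does not write it, so the name/order mismatch and the extra column are both handled.

```scala
import com.datastax.spark.connector._

// Sketch only: each "cassandraColumn" as "scalaField" pair maps a table
// column (e.g. "sid") to the corresponding THData property (e.g. "s_id").
// The extra tr_tag column is omitted, so Cassandra leaves it unset.
def insert(data: THData): Unit = {
  val em = sc.parallelize(Seq(data))
  em.saveToCassandra("ap", "t_s_data",
    SomeColumns(
      "sid"   as "s_id",
      "aid"   as "a_id",
      "styp"  as "s_typ",
      "sdt"   as "s_dt",
      "tsid"  as "t_s_id",
      "asno"  as "a_s_no",
      "avgsp" as "avg_sp"))
}
```

Note that the `as` syntax comes from the implicits in `com.datastax.spark.connector._`; without that import, the string-to-column conversion will not compile.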
