
How to create RDD object on cassandra data using pyspark

I am using Cassandra 2.0.3 and I would like to use pyspark (the Apache Spark Python API) to create an RDD from Cassandra data.

PLEASE NOTE: I do not want to import CQL and then run CQL queries from the pyspark API; rather, I would like to create an RDD on which I would like to do some transformations.

I know this can be done in Scala, but I am not able to find out how it could be done from pyspark.

I would really appreciate it if anyone could guide me on this.

Might not be relevant to you anymore, but I was looking for the same thing and couldn't find anything I was happy with. So I did some work on this: https://github.com/TargetHolding/pyspark-cassandra . It needs a lot of testing before use in production, but I think the integration works quite nicely.
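For reference, usage with pyspark-cassandra looks roughly like the sketch below. The keyspace, table, column names, and the `localhost` connection host are all placeholder assumptions, and the exact API may differ between versions of the library; the pure helper mirrors the RDD logic so it can be checked without a cluster.

```python
# Placeholder keyspace/table/column names throughout; not tied to any real schema.

def count_by_country(rows):
    """Pure helper, testable without a cluster: count rows per value of a
    'country' column, given Cassandra rows represented as plain dicts.
    This mirrors the map/reduceByKey pipeline in the cluster code below."""
    counts = {}
    for row in rows:
        counts[row["country"]] = counts.get(row["country"], 0) + 1
    return counts

if __name__ == "__main__":
    # Requires Spark and the pyspark-cassandra package on the driver.
    import pyspark_cassandra
    from pyspark import SparkConf

    conf = (SparkConf()
            .setAppName("cassandra-rdd-demo")
            .set("spark.cassandra.connection.host", "localhost"))
    sc = pyspark_cassandra.CassandraSparkContext(conf=conf)

    # cassandraTable returns a regular PySpark RDD of row dicts, so the
    # usual transformations (map, filter, reduceByKey, ...) apply directly.
    rdd = sc.cassandraTable("my_keyspace", "users")
    print(rdd.map(lambda row: (row["country"], 1))
             .reduceByKey(lambda a, b: a + b)
             .take(10))
```

The point for the original question is the last three lines: once `cassandraTable` has produced the RDD, no CQL is involved and transformations are written exactly as for any other RDD.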

I'm not sure whether you have seen this example: https://github.com/apache/spark/blob/master/examples/src/main/python/cassandra_inputformat.py . I have read that Cassandra reads use a similar pattern.
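That example reads Cassandra through the Hadoop InputFormat path, via `SparkContext.newAPIHadoopRDD`. A rough sketch of the same pattern is below; the host, keyspace, and column-family names are placeholders, the Thrift port 9160 matches Cassandra 2.0.x defaults, and the converter classes must be on Spark's classpath (they ship with the Spark examples jar).

```python
# Sketch modelled on Spark's cassandra_inputformat.py example.

def cassandra_hadoop_conf(host, keyspace, column_family):
    """Build the Hadoop configuration dict expected by Cassandra's
    CqlPagingInputFormat (Cassandra 2.0.x, Thrift port 9160)."""
    return {
        "cassandra.input.thrift.address": host,
        "cassandra.input.thrift.port": "9160",
        "cassandra.input.keyspace": keyspace,
        "cassandra.input.columnfamily": column_family,
        "cassandra.input.partitioner.class": "Murmur3Partitioner",
        "cassandra.input.page.row.size": "100",
    }

if __name__ == "__main__":
    from pyspark import SparkContext  # requires a Spark installation

    sc = SparkContext(appName="CassandraInputFormatDemo")
    rdd = sc.newAPIHadoopRDD(
        "org.apache.cassandra.hadoop.cql3.CqlPagingInputFormat",
        "java.util.Map",   # key class
        "java.util.Map",   # value class
        keyConverter="org.apache.spark.examples.pythonconverters."
                     "CassandraCQLKeyConverter",
        valueConverter="org.apache.spark.examples.pythonconverters."
                       "CassandraCQLValueConverter",
        conf=cassandra_hadoop_conf("localhost", "my_keyspace", "my_cf"))

    # rdd is an ordinary PySpark RDD of (key, value) dict pairs,
    # so transformations work as usual and no CQL query is issued by hand.
    print(rdd.map(lambda kv: kv[1]).take(5))
```

This avoids writing CQL from Python, as the question asks, at the cost of Hadoop-style configuration; the pyspark-cassandra route in the other answer is the more direct API.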
