
Spark SQL JSON error

I tried to create a JSON RDD using the SQL context, but I am getting the error below. My code and the full traceback follow. Please suggest a fix.

My code:

rdd1=sqlContext.jsonRDD('/idn/home/ijamalmo/ibfiles/datafiles/sampledata.json')

Error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/mapr/spark/spark-1.4.1/python/pyspark/sql/context.py", line 442, in jsonRDD
    keyed = rdd.mapPartitions(func)
AttributeError: 'str' object has no attribute 'mapPartitions'

jsonRDD expects an RDD of JSON strings, not a file path, which is why the path string you passed fails with 'str' object has no attribute 'mapPartitions'. To load a JSON file, use the DataFrame reader instead:

rdd1 = sqlContext.read.json('fileLoc').rdd
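For context, here is a minimal sketch of both approaches, assuming a PySpark 1.4.x script (the SparkContext setup and the app name are illustrative; the file path is the one from the question):

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext(appName="json-example")  # hypothetical app name
sqlContext = SQLContext(sc)

path = '/idn/home/ijamalmo/ibfiles/datafiles/sampledata.json'

# Preferred in Spark 1.4+: let the DataFrame reader parse the file directly.
df = sqlContext.read.json(path)
rdd1 = df.rdd  # an RDD of Row objects, if you really need an RDD

# Legacy alternative: jsonRDD does exist in Spark 1.4, but it expects an
# RDD of JSON strings, not a path string; passing a path causes the
# AttributeError shown in the question.
json_strings = sc.textFile(path)  # RDD[str], one JSON object per line
df_legacy = sqlContext.jsonRDD(json_strings)

Note that read.json returns a DataFrame, so keep it as a DataFrame unless downstream code specifically needs an RDD.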
