
Fetching JSON data by Spark-SQL

When I try to fetch nested JSON data using a Spark-SQL query:

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

SparkContext sc = new SparkContext(new SparkConf().setAppName("sql").setMaster("local"));
SQLContext sqlContext = new SQLContext(sc);
DataFrame df = sqlContext.read().json("path_to_s3_bucket").cache();
df.registerTempTable("table_name");
DataFrame d = sqlContext.sql("Select address.state as state from table_name");

I get the following exception:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Can't extract value from address

My JSON data looks like this:

"address":{"city":"xyz","state":"abc","country":"pqr"}

Please help in resolving the issue.

Your JSON is invalid. It should be: {"address":{"city":"xyz","state":"abc","country":"pqr"}}
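
A minimal sketch of how this can work once each line of the input file is a complete JSON object, using the same Spark 1.x API as in the question (the file name people.json below is a hypothetical stand-in for the S3 path):

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class NestedJsonExample {
    public static void main(String[] args) {
        SparkContext sc = new SparkContext(
                new SparkConf().setAppName("sql").setMaster("local"));
        SQLContext sqlContext = new SQLContext(sc);

        // Each line of people.json is assumed to be a self-contained object, e.g.
        // {"address":{"city":"xyz","state":"abc","country":"pqr"}}
        // so Spark infers "address" as a struct column instead of a plain string.
        DataFrame df = sqlContext.read().json("people.json").cache();
        df.registerTempTable("table_name");

        // Dot notation works once address is a struct column.
        DataFrame d = sqlContext.sql("SELECT address.state AS state FROM table_name");
        d.show();
    }
}

With the corrected input, the query returns the nested state value ("abc") instead of raising the AnalysisException.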
