
Spark Reading JSON

I am trying to work through the examples at http://spark.apache.org/docs/latest/sql-programming-guide.html#sql

I tried this command in the Spark shell:

val df = sqlContext.read.json("examples/src/main/resources/people.json")

It gives an error: "No input paths specified in job".

Why does this problem happen? How can I deal with it?

Thanks.

The error means that you didn't start spark-shell from the right folder: the path you passed is relative, so it is resolved against the shell's current working directory. To solve the problem, either change your working directory to the Spark home directory before launching spark-shell, or change the path to an absolute path.
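As a quick sanity check, a sketch like the following (run in the same spark-shell session) prints the directory the relative path is resolved against and then builds the path from the SPARK_HOME environment variable instead; it assumes SPARK_HOME is actually set in your environment.

scala> // Print the directory the shell was launched from; relative paths resolve against it
scala> new java.io.File(".").getCanonicalPath

scala> // Build an absolute path from SPARK_HOME (assumes the SPARK_HOME environment variable is set)
scala> val sparkHome = sys.env.getOrElse("SPARK_HOME", ".")
scala> val df = sqlContext.read.json(s"$sparkHome/examples/src/main/resources/people.json")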

For example, if Spark is installed in /Applications/spark-1.6.1 on a Mac, you can either first navigate to that folder in your shell and then run spark-shell, or point to the file with an absolute path:

scala> val df = sqlContext.read.json("/Applications/spark-1.6.1/examples/src/main/resources/people.json")

scala> df.show()
+----+-------+
| age|   name|
+----+-------+
|null|Michael|
|  30|   Andy|
|  19| Justin|
+----+-------+
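For reference, the people.json file shipped with the Spark examples contains one JSON object per line, roughly as shown below; Michael's record has no age field, which is why his row shows null.

{"name":"Michael"}
{"name":"Andy", "age":30}
{"name":"Justin", "age":19}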
