I am trying to read a JSON file on my local Windows machine using Spark and Scala. I have tried the following:
import org.apache.spark.sql.SparkSession

object JsonTry extends App {
  System.setProperty("hadoop.home.dir", "C:\\winutils")

  val sparkSession = SparkSession.builder()
    .master("local[*]")
    .config("some-config", "some-value")
    .appName("App Name")
    .getOrCreate()

  val res = sparkSession.read.json("./src/main/resources/test.json")
  res.printSchema()
}
The JSON file, which is under the resources folder, looks like this:
{"name":"Some name"}
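Since the path above is relative, it resolves against the JVM's working directory, which depends on how the application is launched (IDE run configuration vs. command line). A quick stdlib-only check, with the same hypothetical path, can confirm what Spark will actually try to open:

```scala
import java.nio.file.{Files, Paths}

object PathCheck extends App {
  // Same relative path as in the question; adjust to your layout.
  val path = Paths.get("./src/main/resources/test.json")

  // Print the absolute path the relative path resolves to,
  // and whether a file actually exists there.
  println(path.toAbsolutePath.normalize)
  println(s"exists: ${Files.exists(path)}")
}
```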
But I am getting an exception when I run this main class:
Exception in thread "main" java.io.IOException: Could not create FileClient
Screenshot attached:
To my surprise, this piece of code works, but I am looking to read the JSON from a file directly:

val res = sparkSession.read.option("multiline", true).json(sparkSession.sparkContext.parallelize(Seq("{\"name\":\"name\"}")))
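As a side note, the `json(RDD[String])` overload used in the workaround is deprecated; since Spark 2.2 the supported equivalent takes a `Dataset[String]`. A minimal sketch of that variant, assuming Spark is on the classpath, using a triple-quoted string to avoid the double escaping:

```scala
import org.apache.spark.sql.SparkSession

object JsonFromString extends App {
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("JsonFromString")
    .getOrCreate()

  import spark.implicits._

  // Triple-quoted string: no backslash escaping needed for the quotes.
  val ds = Seq("""{"name":"Some name"}""").toDS()
  val res = spark.read.json(ds)
  res.printSchema()

  spark.stop()
}
```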
Please let me know what is causing this issue, as I have not been able to find a solution.
I tried to read a JSON file in a similar way and didn't face any problem. You may try this too:
import org.apache.spark.sql.SparkSession

object myTest extends App {
  val spark: SparkSession = SparkSession.builder()
    .appName("MyTest")
    .master("local[*]")
    .getOrCreate()

  import spark.implicits._

  val jsonDataDF = spark.read.option("multiline", "true").json("/Users/gp/Desktop/temp/test.json")
  jsonDataDF.show()
}
Output and input data (JSON) were shown in screenshots (omitted here).
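One more thing worth trying: the "Could not create FileClient" message suggests the path is being handed to a non-local filesystem client rather than the local one (this is an assumption based on the error text, not something confirmed in the question). Passing an explicit `file:///` URI forces Spark to resolve the path against the local filesystem. A sketch with a hypothetical Windows path:

```scala
import org.apache.spark.sql.SparkSession

object JsonLocal extends App {
  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("JsonLocal")
    .getOrCreate()

  // Explicit file:/// URI so the local filesystem is used.
  // The path below is hypothetical; adjust to your machine.
  val df = spark.read.json("file:///C:/path/to/test.json")
  df.printSchema()

  spark.stop()
}
```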
Do let me know whether I understood your question properly.