Spark | Could not create FileClient | read json | scala
I am trying to read a JSON file on a local Windows machine using Spark and Scala. I have tried the following:
import org.apache.spark.sql.SparkSession

object JsonTry extends App {

  System.setProperty("hadoop.home.dir", "C:\\winutils")

  val sparkSession = SparkSession.builder()
    .master("local[*]")
    .config("some-config", "some-value")
    .appName("App Name")
    .getOrCreate()

  val res = sparkSession.read.json("./src/main/resources/test.json")
  res.printSchema()
}
The JSON file under the resources folder looks like this:
{"name":"Some name"}
But I am getting an exception when I run this main class:
Exception in thread "main" java.io.IOException: Could not create FileClient
Screenshot attached:
To my surprise, the following piece of code works, but I want to read the JSON directly from a file:
val res = sparkSession.read.option("multiline", true).json(sparkSession.sparkContext.parallelize(Seq("{\"name\":\"name\"}")))
Please let me know what is causing this issue, as I have not found any solution so far.
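For reference, one thing worth ruling out is path resolution: if something other than the local filesystem is acting as the default filesystem (this is an assumption, not confirmed by the stack trace), a relative path like the one above will not be resolved against the local disk, while an absolute `file://` URI always will. A minimal sketch of turning the relative path into such a URI, using only `java.nio` so it runs without Spark:

```scala
import java.nio.file.Paths

object PathCheck {
  // Turn a relative path into an absolute file:// URI string.
  // Passing this URI (instead of a bare relative path) to spark.read.json
  // pins the read to the local filesystem.
  def toFileUri(relative: String): String =
    Paths.get(relative).toAbsolutePath.normalize.toUri.toString

  def main(args: Array[String]): Unit =
    // Prints something like file:///C:/project/src/main/resources/test.json on Windows.
    println(toFileUri("./src/main/resources/test.json"))
}
```

The resulting string can then be handed to `sparkSession.read.json(...)` in place of `"./src/main/resources/test.json"`.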
I tried to read a JSON file in a similar way and did not face any problem. You may try this too:
import org.apache.spark.sql.SparkSession

object myTest extends App {

  val spark: SparkSession = SparkSession.builder()
    .appName("MyTest")
    .master("local[*]")
    .getOrCreate()

  import spark.implicits._

  val jsonDataDF = spark.read.option("multiline", "true").json("/Users/gp/Desktop/temp/test.json")

  jsonDataDF.show()
}
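A note on the `multiline` option used above: by default Spark expects JSON Lines input, i.e. one complete JSON object per physical line, so the question's single-line `{"name":"Some name"}` is readable without any option; `multiline` is only needed when one object spans several lines. A small sketch of the two input shapes as plain Scala strings (the matching `spark.read` call is noted in the comments):

```scala
object JsonShapes {
  // JSON Lines: one complete record per line.
  // Readable with the default spark.read.json(path), no options needed.
  val jsonLines: String =
    """{"name":"Some name"}
      |{"name":"Another name"}""".stripMargin

  // Pretty-printed JSON: a single record spanning several lines.
  // Needs spark.read.option("multiline", "true").json(path).
  val multiline: String =
    """{
      |  "name": "Some name"
      |}""".stripMargin

  def main(args: Array[String]): Unit = {
    println(jsonLines)
    println(multiline)
  }
}
```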
(Output and input data screenshots omitted.)
Do let me know whether I understood your question properly.