Loading data from GCS using Spark Local
I am trying to read data from GCS buckets on my local machine, for testing purposes. I would like to sample some of the data in the cloud. I have downloaded the GCS Hadoop Connector JAR.
And set up the SparkConf as follows:
from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession

conf = SparkConf() \
    .setMaster("local[8]") \
    .setAppName("Test") \
    .set("spark.jars", "path/gcs-connector-hadoop2-latest.jar") \
    .set("spark.hadoop.google.cloud.auth.service.account.enable", "true") \
    .set("spark.hadoop.google.cloud.auth.service.account.json.keyfile", "path/to/keyfile")
sc = SparkContext(conf=conf)
spark = SparkSession.builder \
    .config(conf=sc.getConf()) \
    .getOrCreate()
spark.read.json("gs://gcs-bucket")
I have also tried to set the conf like so:
sc._jsc.hadoopConfiguration().set("fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS")
sc._jsc.hadoopConfiguration().set("fs.gs.auth.service.account.json.keyfile", "path/to/keyfile")
sc._jsc.hadoopConfiguration().set("fs.gs.auth.service.account.enable", "true")
I am using PySpark installed via pip and running the code with the unit test module in IntelliJ, but I get the following error:
py4j.protocol.Py4JJavaError: An error occurred while calling o128.json.
: java.io.IOException: No FileSystem for scheme: gs
What should I do?
Thanks!
To resolve this issue, in addition to the properties you have already configured, you also need to add a configuration for the fs.gs.impl property:
sc._jsc.hadoopConfiguration().set("fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")
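For reference, here is a minimal end-to-end sketch combining the settings above into one place; the jar path, keyfile path, and bucket name are placeholders taken from the question, not real values:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SparkSession

# Placeholder jar path; point this at your local GCS connector JAR.
conf = SparkConf() \
    .setMaster("local[8]") \
    .setAppName("Test") \
    .set("spark.jars", "path/gcs-connector-hadoop2-latest.jar")
sc = SparkContext(conf=conf)

# Register the GCS connector classes and service-account credentials
# on the Hadoop configuration so the gs:// scheme can be resolved.
hadoop_conf = sc._jsc.hadoopConfiguration()
hadoop_conf.set("fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")
hadoop_conf.set("fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS")
hadoop_conf.set("fs.gs.auth.service.account.enable", "true")
hadoop_conf.set("fs.gs.auth.service.account.json.keyfile", "path/to/keyfile")

spark = SparkSession.builder.config(conf=sc.getConf()).getOrCreate()
df = spark.read.json("gs://gcs-bucket")  # should no longer fail with "No FileSystem for scheme: gs"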