import org.apache.spark.sql.SparkSession

val spark = SparkSession
  .builder()
  .appName("try1")
  .master("local")
  .getOrCreate()

import spark.implicits._ // needed for the $"uid" column syntax

val df = spark.read
  .json("s3n://BUCKET-NAME/FOLDER/FILE.json")
  .select($"uid")

df.show(5)
I have set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as environment variables, but I get the following error when trying to read from S3.
Exception in thread "main" org.apache.hadoop.fs.s3.S3Exception: org.jets3t.service.S3ServiceException: S3 HEAD request failed for '/FOLDER%2FFILE.json' - ResponseCode=400, ResponseMessage=Bad Request
I suspect the error is caused by "/" being converted to "%2F" by some internal function, since the error shows '/FOLDER%2FFILE.json' instead of '/FOLDER/FILE.json'.
Your Spark (JVM) application does not pick up those environment variables unless you tell it to, so a quick workaround is to set the credentials explicitly in the Hadoop configuration:
// credentials for the s3n:// scheme used in the read above
spark.sparkContext
  .hadoopConfiguration.set("fs.s3n.awsAccessKeyId", awsAccessKeyId)
spark.sparkContext
  .hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", awsSecretAccessKey)
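The awsAccessKeyId and awsSecretAccessKey values above are not defined in the snippet; a minimal sketch, assuming the credentials live in the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables of the driver process, would read them with sys.env and fail fast if either one is missing:

// Hypothetical helper values: read the credentials from the driver's environment.
// sys.env is Scala's immutable map view of the process environment variables.
val awsAccessKeyId = sys.env.getOrElse("AWS_ACCESS_KEY_ID",
  throw new IllegalStateException("AWS_ACCESS_KEY_ID is not set"))
val awsSecretAccessKey = sys.env.getOrElse("AWS_SECRET_ACCESS_KEY",
  throw new IllegalStateException("AWS_SECRET_ACCESS_KEY is not set"))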
You'll also need to specify the S3 endpoint:
spark.sparkContext
  .hadoopConfiguration.set("fs.s3a.endpoint", "<<ENDPOINT>>")
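Note that the endpoint key above uses the s3a connector prefix while the read in the question uses the s3n:// scheme. For reference, here is a minimal all-s3a sketch (an assumption, not part of the original answer: it presumes the hadoop-aws module is on the classpath, and the bucket, path and endpoint values are placeholders):

// Assumed s3a variant: credentials and endpoint use the s3a property names,
// and the path uses the matching s3a:// scheme.
spark.sparkContext.hadoopConfiguration.set("fs.s3a.access.key", awsAccessKeyId)
spark.sparkContext.hadoopConfiguration.set("fs.s3a.secret.key", awsSecretAccessKey)
spark.sparkContext.hadoopConfiguration.set("fs.s3a.endpoint", "s3.eu-west-1.amazonaws.com") // example endpoint, pick your bucket's region

val df = spark.read
  .json("s3a://BUCKET-NAME/FOLDER/FILE.json")
  .select($"uid")

df.show(5)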
To learn more about what an AWS S3 endpoint is, refer to the AWS documentation on S3 regions and endpoints.