
AWS credentials not found when using Spark/Scala app to access S3

I'm using a Windows environment. Hadoop is not installed; I only have a hadoop folder containing bin and winutils, that is all.

I've set environment variables for both the key ID and the secret as per the docs, but I constantly get this exception:

Exception in thread "main" java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3n URL, or by setting the fs.s3n.awsAccessKeyId or fs.s3n.awsSecretAccessKey properties (respectively).

Any idea how I can fix this, please?

I've confirmed the variables are set by echoing them, but they're not getting picked up. Thanks.
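
For reference, the exception describes the two places the s3n connector looks for credentials: embedded in the URL itself, or in the fs.s3n.* configuration properties. A minimal sketch of the URL form, with a hypothetical bucket name (embedding secrets in URLs is discouraged, since they end up in logs and shell histories):

val lines = sc.textFile("s3n://ACCESS_KEY:SECRET_KEY@my-bucket/data.txt")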

Have you tried this configuration?

import org.apache.spark.{SparkConf, SparkContext}

// myAccessKey and mySecretKey stand in for your AWS credentials
val conf = new SparkConf().setAppName("s3-access")
val sc = new SparkContext(conf)
// Set the credentials on the Hadoop configuration used by Spark
val hadoopConf = sc.hadoopConfiguration
hadoopConf.set("fs.s3.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
hadoopConf.set("fs.s3.awsAccessKeyId", myAccessKey)
hadoopConf.set("fs.s3.awsSecretAccessKey", mySecretKey)
