How to pass --properties-file to spark-submit in Qubole?

I am using Spark on Qubole, with the clusters created in AWS. In the Qubole Workbench, when I execute the command line below, it works fine and the command succeeds:

/usr/lib/spark/bin/spark-submit s3://bucket-name/SparkScripts/test.py

But when I execute the same command with the --properties-file option added:

/usr/lib/spark/bin/spark-submit --properties-file s3://bucket-name/SparkScripts/properties.file s3://bucket-name/SparkScripts/test.py

it fails with the following error message:

Qubole > Shell Command failed with exit code: 1

App > Error occurred when getting effective config required to initialize Qubole security provider

App > Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Properties file s3:/bucket-name/SparkScripts/properties.file does not exist

Can someone help me fix this? I need some application properties to be stored in a separate file on Amazon S3 and passed via --properties-file to my Spark program.

@saravanan - Qubole does not currently support specifying --properties-file with an S3 path. It will be available in release 59.
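Until S3 paths are supported, one possible workaround (a sketch, not Qubole-specific) is to avoid --properties-file entirely and translate the properties into explicit --conf flags on the spark-submit command line, which take the same key=value pairs. Assuming the file uses the usual Java-style .properties syntax, a small helper can do the conversion:

```python
def properties_to_conf_args(text):
    """Convert Java-style .properties lines into spark-submit --conf arguments.

    Blank lines and lines starting with '#' or '!' are skipped, matching
    the standard .properties comment syntax. Each remaining key=value
    line becomes a ["--conf", "key=value"] pair.
    """
    args = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", "!")):
            continue
        key, sep, value = line.partition("=")
        if sep:  # ignore malformed lines with no '='
            args.extend(["--conf", f"{key.strip()}={value.strip()}"])
    return args

# Hypothetical contents of the properties.file stored on S3
props = """
# application settings
spark.executor.memory=4g
spark.sql.shuffle.partitions=200
"""
print(properties_to_conf_args(props))
# → ['--conf', 'spark.executor.memory=4g', '--conf', 'spark.sql.shuffle.partitions=200']
```

The resulting list can be spliced into the spark-submit invocation (e.g. via subprocess) after first fetching the file from S3 with whatever S3 client is available on the cluster; the key and value names above are illustrative, not taken from the original question's file.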

