
az synapse spark job submit

According to the documentation, using az synapse spark job submit, I can pass arguments using --arguments. So far so good.
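For reference, the kind of submission I mean looks like this (the workspace, pool, storage, and class names here are placeholders, not my real values):

```shell
# Submit a Spark job to a Synapse Spark pool, passing two positional
# arguments to the application via --arguments.
az synapse spark job submit \
  --workspace-name myworkspace \
  --spark-pool-name mypool \
  --name foo-job \
  --main-definition-file "abfss://jobs@mystorage.dfs.core.windows.net/foo.jar" \
  --main-class-name com.example.Foo \
  --executor-size Small \
  --executors 2 \
  --arguments 2022-03-25 some-other-arg
```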
However, I cannot figure out how to actually access those arguments in my code. Here's my current effort:

import java.time.LocalDateTime
import java.time.format.DateTimeFormatter
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

val conf = new SparkConf().setAppName("foo")
val sc = new SparkContext(conf)
val spark = SparkSession.builder.appName("foo").getOrCreate()
val start_time = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm").format(LocalDateTime.now)
val appID = sc.getConf.getAppId


// let's get some arguments
val inputArgs = spark.sqlContext.getConf("spark.driver.args").split("\\s+")
// val inputArgs = sc.getConf.get("spark.driver.args").split("\\s+")

Either of those lines throws the following exception, presumably because nothing ever set spark.driver.args:

22/03/25 19:07:45 ERROR ApplicationMaster: User class threw exception: java.util.NoSuchElementException: spark.driver.args
java.util.NoSuchElementException: spark.driver.args

So, how do I read the arguments in the Scala code?

OK, I was overcomplicating this. The values passed with --arguments simply arrive as the ordinary args array of the main method; no Spark configuration lookup is needed:

def main(args: Array[String]): Unit = {
  ...

  val foo = args(0)
}
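Put together, a minimal self-contained sketch of the idea (the argument names and the two-argument shape are made up for illustration):

```scala
// Minimal sketch: values supplied via --arguments show up, in order,
// as the ordinary args array of the main method.
object ArgsDemo {
  // Pull out the expected positional arguments; the names are illustrative.
  def parseArgs(args: Array[String]): (String, String) = {
    require(args.length >= 2, s"expected at least 2 arguments, got ${args.length}")
    (args(0), args(1))
  }

  def main(args: Array[String]): Unit = {
    val (runDate, inputPath) = parseArgs(args)
    println(s"runDate=$runDate inputPath=$inputPath")
  }
}
```

Validating the argument count up front with require gives a clear error at startup instead of an ArrayIndexOutOfBoundsException later in the job.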

