K8s Spark Job JAR params
I am using the manifest below, and when applying it I get the error shown further down. Is this the correct way to pass JAR arguments?
apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command: [
            "/bin/sh",
            "-c",
            "/opt/spark/bin/spark-submit \
            --master k8s://EKSEndpoint \
            --deploy-mode cluster \
            --name spark-luluapp \
            --class com.ll.jsonclass \
            --conf spark.jars.ivy=/tmp/.ivy \
            --conf spark.kubernetes.container.image=repo:buildversion \
            --conf spark.kubernetes.namespace=spark-pi \
            --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa \
            --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
            --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa \
            --conf spark.kubernetes.driver.pod.name=spark-job-driver \
            --conf spark.executor.instances=4 \
            local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar \
            [mks,env,reg,"dd.mm.yyyy","true","off","db-comp-results","true","XX","XXX","XXXXX","XXX",$$,###] "
          ]
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4
error converting YAML to JSON: yaml: line 33: did not find expected ',' or ']'
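The parse failure can be reproduced outside Kubernetes with a minimal snippet (assuming PyYAML is available; the `snippet` string is a hypothetical shortened stand-in for the manifest's `command` line): the first inner `"` ends the outer double-quoted scalar early, and the flow-sequence parser then hits an unexpected token.

```python
import yaml  # PyYAML; assumed available, same parser family as kubectl's YAML handling

# Hypothetical shortened stand-in for the manifest's command entry:
# the inner double quotes terminate the outer double-quoted scalar early.
snippet = 'command: ["/bin/sh", "-c", "spark-submit app.jar [a,"b",c]"]'

try:
    yaml.safe_load(snippet)
    result = "ok"
except yaml.YAMLError:
    # PyYAML raises a parse error complaining about a missing ',' or ']',
    # mirroring the kubectl error above.
    result = "error"

print(result)
```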
Your YAML format is wrong. The argument list contains double quotes (e.g. "dd.mm.yyyy") inside a double-quoted scalar, so the first inner `"` terminates the scalar early and the parser then reports the missing ',' or ']'. Single-quoting the command string avoids this. Try this one:
apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command:
            - "/bin/sh"
            - "-c"
            - '/opt/spark/bin/spark-submit \
              --master k8s://EKSEndpoint \
              --deploy-mode cluster \
              --name spark-luluapp \
              --class com.ll.jsonclass \
              --conf spark.jars.ivy=/tmp/.ivy \
              --conf spark.kubernetes.container.image=repo:buildversion \
              --conf spark.kubernetes.namespace=spark-pi \
              --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa \
              --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
              --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa \
              --conf spark.kubernetes.driver.pod.name=spark-job-driver \
              --conf spark.executor.instances=4 \
              local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar \
              [mks,env,reg,"dd.mm.yyyy","true","off","db-comp-results","true","XX","XXX","XXXXX","XXX",$$,###] '
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4
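Separately, note that spark-submit passes everything after the application JAR to the main class as space-separated positional arguments. A hedged alternative sketch (paths and values copied from the manifest above; whether com.ll.jsonclass expects one bracketed string or individual arguments is an assumption you would need to verify) is to drop the shell wrapper and let Kubernetes supply each token via args:, which sidesteps the quoting problem entirely:

```yaml
# Sketch, not a drop-in replacement: invoke spark-submit directly so that
# Kubernetes passes each token as-is, with no shell quoting involved.
command: ["/opt/spark/bin/spark-submit"]
args:
  - --master
  - k8s://EKSEndpoint
  - --deploy-mode
  - cluster
  - --class
  - com.ll.jsonclass
  # ...remaining --conf flags exactly as in the manifest above...
  - local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar
  - mks          # application arguments, one list item each
  - env
  - reg
  - dd.mm.yyyy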