
K8s Spark Job JAR params

I am using the manifest below, and when I apply it I get the error shown underneath. Is this the correct way to pass JAR arguments?

apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command: [
            "/bin/sh",
            "-c",
            "/opt/spark/bin/spark-submit \
            --master k8s://EKSEndpoint \
            --deploy-mode cluster \
            --name spark-luluapp \
            --class com.ll.jsonclass \
            --conf spark.jars.ivy=/tmp/.ivy \
            --conf spark.kubernetes.container.image=repo:buildversion \
            --conf spark.kubernetes.namespace=spark-pi \
            --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa \
            --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
            --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa \
            --conf spark.kubernetes.driver.pod.name=spark-job-driver \
            --conf spark.executor.instances=4 \
            local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar \
            [mks,env,reg,"dd.mm.yyyy","true","off","db-comp-results","true","XX","XXX","XXXXX","XXX",$$,###] " 
          ]
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4

error converting YAML to JSON: yaml: line 33: did not find expected ',' or ']'

Your YAML format is wrong. In the flow-style command: [ ... ] you used, the whole spark-submit invocation is one double-quoted scalar, so the unescaped double quotes in the application-argument list ("dd.mm.yyyy", "true", ...) end the scalar early and the parser then fails looking for ',' or ']'. Use a block-style list and wrap the spark-submit command in single quotes instead. Try this one:

apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command:  
            - "/bin/sh"
            - "-c"
            - '/opt/spark/bin/spark-submit \
            --master k8s://EKSEndpoint \
            --deploy-mode cluster \
            --name spark-luluapp \
            --class com.ll.jsonclass \
            --conf spark.jars.ivy=/tmp/.ivy \
            --conf spark.kubernetes.container.image=repo:buildversion \
            --conf spark.kubernetes.namespace=spark-pi \
            --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa \
            --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
            --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa \
            --conf spark.kubernetes.driver.pod.name=spark-job-driver \
            --conf spark.executor.instances=4 \
            local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar \
            [mks,env,reg,"dd.mm.yyyy","true","off","db-comp-results","true","XX","XXX","XXXXX","XXX",$$,###] '

      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4
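
If you want to avoid the long quoted shell string altogether, here is a minimal sketch (assuming the same image and jar path) that runs spark-submit directly and passes every flag and every application argument as its own list item, so neither the shell nor the YAML parser has to deal with quoting. Note that this bypasses /bin/sh, so shell expansions such as $$ in the original argument list would no longer be expanded:

apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          # Run spark-submit directly; each flag and each application
          # argument is a separate list item, so no quoting is needed.
          command: ["/opt/spark/bin/spark-submit"]
          args:
            - --master
            - k8s://EKSEndpoint
            - --deploy-mode
            - cluster
            - --name
            - spark-luluapp
            - --class
            - com.ll.jsonclass
            - --conf
            - spark.kubernetes.container.image=repo:buildversion
            - --conf
            - spark.kubernetes.namespace=spark-pi
            - --conf
            - spark.executor.instances=4
            # ... the remaining --conf flags from above go here in the same way
            - local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar
            # application arguments follow the jar, one per item
            - mks
            - env
            - reg
            - dd.mm.yyyy
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4

Either way, you can check that the manifest parses before submitting it with kubectl apply --dry-run=client -f job.yaml, which surfaces the same "error converting YAML to JSON" message without creating the Job.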
