
K8s Spark Job JAR params

I am using the manifest below, and I get an error when applying it. Is this the correct way to pass the JAR arguments?

apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command: [
            "/bin/sh",
            "-c",
            "/opt/spark/bin/spark-submit \
            --master k8s://EKSEndpoint \
            --deploy-mode cluster \
            --name spark-luluapp \
            --class com.ll.jsonclass \
            --conf spark.jars.ivy=/tmp/.ivy \
            --conf spark.kubernetes.container.image=repo:buildversion \
            --conf spark.kubernetes.namespace=spark-pi \
            --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa \
            --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
            --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa \
            --conf spark.kubernetes.driver.pod.name=spark-job-driver \
            --conf spark.executor.instances=4 \
            local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar \
            [mks,env,reg,"dd.mm.yyyy","true","off","db-comp-results","true","XX","XXX","XXXXX","XXX",$$,###] " 
          ]
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4

error converting YAML to JSON: yaml: line 33: did not find expected ',' or ']'

Your YAML is malformed. The third element of command is a double-quoted scalar, so the first unescaped " inside the argument list (before dd.mm.yyyy) terminates the string early and leaves the flow sequence unclosed, which is the parse error you see. A single-quoted scalar keeps the inner double quotes literal. Try this:

apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command:  
            - "/bin/sh"
            - "-c"
            - '/opt/spark/bin/spark-submit \
            --master k8s://EKSEndpoint \
            --deploy-mode cluster \
            --name spark-luluapp \
            --class com.ll.jsonclass \
            --conf spark.jars.ivy=/tmp/.ivy \
            --conf spark.kubernetes.container.image=repo:buildversion \
            --conf spark.kubernetes.namespace=spark-pi \
            --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa \
            --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
            --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa \
            --conf spark.kubernetes.driver.pod.name=spark-job-driver \
            --conf spark.executor.instances=4 \
            local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar \
            [mks,env,reg,"dd.mm.yyyy","true","off","db-comp-results","true","XX","XXX","XXXXX","XXX",$$,###] '

      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4
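
As a side note, you can catch this class of error before anything is submitted to the cluster with a client-side dry run, which parses the manifest locally without creating the Job (the file name here is just a placeholder):

kubectl apply --dry-run=client -f spark-job.yaml

The same "error converting YAML to JSON" message is produced locally, so you can iterate on the quoting without creating and deleting Jobs on the cluster.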
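
If you don't otherwise need the shell wrapper, a sketch of an alternative that sidesteps quoting entirely is to invoke spark-submit directly and pass every flag and argument as its own list element. This assumes your image lets you override the entrypoint this way; the values below are the placeholders from the question, and most --conf flags are elided for brevity:

apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command: ["/opt/spark/bin/spark-submit"]
          args:
            - --master
            - k8s://EKSEndpoint
            - --deploy-mode
            - cluster
            - --name
            - spark-luluapp
            - --class
            - com.ll.jsonclass
            # remaining --conf flags from the original manifest go here,
            # as one "--conf" / "key=value" pair of list items each
            - --conf
            - spark.kubernetes.namespace=spark-pi
            - local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar
            # program arguments follow the application JAR, one per element
            - dd.mm.yyyy
            - db-comp-results
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4

Note that spark-submit passes everything after the application JAR to your main class as program arguments, so each argument goes in as its own list element; there is no need for a bracketed list.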
