
EMR - PySpark error - Container exited with a non-zero exit code 13. Error file: prelaunch.err

I am trying to execute a hello-world-style program in PySpark. I created an EMR cluster through boto3 and added a step to execute my code.

The step is:

        {
            'Name': 'Run Step',
            'ActionOnFailure': 'CONTINUE',
            'HadoopJarStep': {
                'Args': [
                    'spark-submit',
                    '--master', 'yarn',
                    '--deploy-mode', 'cluster',
                  #  '--py-files',
                    's3://bucket/s3csvload.py'
                ],
                'Jar': 'command-runner.jar'
            }
        }
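For context, a step like this is submitted with boto3's EMR client via `add_job_flow_steps`. Below is a minimal sketch; the cluster id, region, and bucket path are placeholders, and the actual AWS call is left commented out so the snippet runs without credentials:

```python
# Sketch: submitting the spark-submit step to a running EMR cluster.
# The step itself is plain data; command-runner.jar executes the Args.
STEP = {
    'Name': 'Run Step',
    'ActionOnFailure': 'CONTINUE',
    'HadoopJarStep': {
        'Jar': 'command-runner.jar',
        'Args': [
            'spark-submit',
            '--master', 'yarn',
            '--deploy-mode', 'cluster',
            's3://bucket/s3csvload.py',
        ],
    },
}

# On a machine with AWS credentials configured (cluster id is a placeholder):
# import boto3
# emr = boto3.client('emr', region_name='us-east-1')
# emr.add_job_flow_steps(JobFlowId='j-XXXXXXXXXXXXX', Steps=[STEP])
```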

The code I am trying to execute is:

from pyspark.sql import *
from pyspark.sql.types import *
from pyspark.sql.functions import *
from pyspark.sql import SparkSession

spark = SparkSession.builder.master('yarn').appName('DIF1').getOrCreate()

Input_schema1 = StructType([StructField("sepal_length", DecimalType(), True),
                            StructField("sepal_width", DecimalType(), True),
                            StructField("petal_length", DecimalType(), True),
                            StructField("petal_width", DecimalType(), True),
                            StructField("species", StringType(), True)])

lookup_df = spark.read \
              .option("header", "true") \
              .option("inferSchema", "true") \
              .option("schema", Input_schema1) \
              .csv("s3://bucket/iris.csv")

lookup_df.write.csv("s3://bucket/Target")

The error I am facing is:

20/12/22 15:27:29 INFO Client: 
     client token: N/A
     diagnostics: Application application_1xxxx0_0003 failed 2 times due to AM Container for appattempt_16xxxxx10_0003_000002 exited with  exitCode: 13
Failing this attempt.Diagnostics: [2020-12-22 15:27:28.643]Exception from container-launch.
Container id: container_16xxxx10_0003_02_000001
Exit code: 13

[2020-12-22 15:27:28.644]Container exited with a non-zero exit code 13. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :

I have tried related links, but they were not much help.

I guess I have to change something in the SparkSession builder, but I am not sure. Any help is appreciated. Thank you.

The following changes to my code solved the issue. The key fixes were passing the script via '--py-files' in the step arguments and removing .master('yarn') from the SparkSession builder:

Steps=[
        {
            'Name': 'Run Step',
            'ActionOnFailure': 'CONTINUE',
            'HadoopJarStep': {
                'Jar': 'command-runner.jar',
                'Args': ['sudo',
                         'spark-submit',
                         '--master', 'yarn',
                         #        '--conf','spark.yarn.submit.waitAppCompletion=true'
                         '--deploy-mode', 'cluster',
                         '--py-files', 's3a://bucket/pgm.py', 's3a://bucket/pgm.py'
                         ]

            }
        }
    ]
spark = SparkSession.builder.appName('DIF1').getOrCreate()
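For reference, the corrected driver script can be reduced to the sketch below. The important point is that it does not call .master('yarn'): with --deploy-mode cluster, spark-submit decides where the driver runs, and a conflicting hard-coded master is a common cause of exit code 13. As an aside, .option("schema", ...) in the original code is not a recognised reader option and is silently ignored; DataFrameReader.schema() is the method that applies an explicit schema. Bucket paths are placeholders, and this sketch only runs when submitted to a cluster:

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import (StructType, StructField,
                               DecimalType, StringType)

# No .master() here: spark-submit's --master/--deploy-mode flags
# control that, and hard-coding it can make the AM exit with code 13.
spark = SparkSession.builder.appName('DIF1').getOrCreate()

input_schema = StructType([
    StructField("sepal_length", DecimalType(), True),
    StructField("sepal_width", DecimalType(), True),
    StructField("petal_length", DecimalType(), True),
    StructField("petal_width", DecimalType(), True),
    StructField("species", StringType(), True),
])

# .schema() (not .option("schema", ...)) applies the explicit schema;
# with it, inferSchema is unnecessary.
lookup_df = (spark.read
             .option("header", "true")
             .schema(input_schema)
             .csv("s3://bucket/iris.csv"))

lookup_df.write.csv("s3://bucket/Target")
```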
