
Invalid initial heap size

This is the log of the error message I get when I deploy a Spark job on an EKS cluster, using Spark 2.4.2 embedded in an Alpine 3.9 Docker image.

Do you have any idea about it?

Thanks

++ id -u
+ myuid=0
++ id -g
+ mygid=0
++ getent passwd 0
+ uidentry=root:x:0:0:root:/root:/bin/ash
+ '[' -z root:x:0:0:root:/root:/bin/ash ']'
+ SPARK_K8S_CMD=driver
+ '[' -z driver ']'
+ shift 1
+ SPARK_CLASSPATH=':/opt/spark/jars/*'
+ env
+ grep SPARK_JAVA_OPT_
+ sort -t_ -k4 -n
+ sed 's/[^=]*=\(.*\)/\1/g'
+ readarray -t SPARK_JAVA_OPTS
+ '[' -n '' ']'
+ '[' -n '' ']'
+ case "$SPARK_K8S_CMD" in
+ CMD=(${JAVA_HOME}/bin/java "${SPARK_JAVA_OPTS[@]}" -cp "$SPARK_CLASSPATH" -Xms$SPARK_DRIVER_MEMORY -Xmx$SPARK_DRIVER_MEMORY -Dspark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS $SPARK_DRIVER_CLASS $SPARK_DRIVER_ARGS)
+ exec /sbin/tini -s -- /usr/lib/jvm/java-1.8-openjdk/bin/java -cp ':/opt/spark/jars/*' -Xms -Xmx -Dspark.driver.bindAddress=xxx.xxx.xxx.xxx
Invalid initial heap size: -Xms
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.

You are not setting any value for the min/initial heap (-Xms) nor for the max heap (-Xmx).

Here:

(...) ':/opt/spark/jars/*' -Xms -Xmx -Dspark.driver.bindAddress=xxx.xxx.xxx.xxx (...)

The error is telling you that the value of the -Xms param is not valid: Invalid initial heap size: -Xms (there is no value at all). In your log, the entrypoint builds the command as -Xms$SPARK_DRIVER_MEMORY -Xmx$SPARK_DRIVER_MEMORY, so an empty SPARK_DRIVER_MEMORY environment variable is what produces the bare -Xms and -Xmx flags.
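
You can reproduce the behaviour outside of Spark: -Xms and -Xmx must carry a size, and a bare -Xms makes the JVM refuse to start. A minimal check on any machine with a JDK on the PATH (the 512M/1024M values are only examples):

$ java -Xms -version
Invalid initial heap size: -Xms
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.

$ java -Xms512M -Xmx1024M -version
(prints the normal JVM version banner and exits successfully)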

Fill those params with valid values and try again. You can set them in MB (-Xms512M, -Xmx1024M) or GB (-Xms1G, -Xmx2G). These are just examples; check your host and the JVM to choose values that fit.
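
With Spark on Kubernetes you normally don't write -Xms/-Xmx yourself: the image entrypoint fills them from the SPARK_DRIVER_MEMORY environment variable, which the submission client is expected to inject into the driver pod from spark.driver.memory. Below is a sketch of a spark-submit invocation that sets it explicitly; the API server address, image name, main class and jar path are placeholders, not values from your setup:

spark-submit \
  --master k8s://https://<eks-api-server>:443 \
  --deploy-mode cluster \
  --name my-spark-job \
  --class com.example.Main \
  --conf spark.kubernetes.container.image=<your-image> \
  --conf spark.driver.memory=1g \
  --conf spark.executor.memory=2g \
  local:///opt/spark/app/my-app.jar

If SPARK_DRIVER_MEMORY still reaches the driver pod empty after that, check how the pod is being created (custom entrypoint, operator, or manifest), because that variable is what the entrypoint expands into -Xms/-Xmx.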
