

Spark Executor: Invalid initial heap size: -Xms0M

I have configured Spark to query a Hive table.

I run the Thrift JDBC/ODBC server using the command below:

cd $SPARK_HOME
./sbin/start-thriftserver.sh --master spark://myhost:7077 --hiveconf hive.server2.thrift.bind.host=myhost --hiveconf hive.server2.thrift.port=9999

Then I checked the Spark worker UI: executor startup is failing with the error below, because JVM initialization fails due to a wrong -Xms value:

Invalid initial heap size: -Xms0M
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.

The following configurations were changed in conf/spark-env.sh:

export SPARK_JAVA_OPTS="-Dspark.executor.memory=512M"
export SPARK_EXECUTOR_MEMORY=1G
export SPARK_DRIVER_MEMORY=512M
export SPARK_WORKER_MEMORY=2G
export SPARK_WORKER_INSTANCES=1

I really have no clue where this -Xms0M value is coming from or how it has been derived. Please help me understand the issue and change this value.
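As to how the flag is derived: in standalone mode the Spark worker builds each executor's java command with -Xms/-Xmx taken from spark.executor.memory, so a value that resolves to 0 shows up verbatim as -Xms0M. A rough, illustrative sketch of the command shape (class path and arguments abbreviated; this is not the exact command the worker logs):

# Illustrative only: the heap flags are derived from spark.executor.memory
java -cp <spark-classpath> -Xms0M -Xmx0M \
  org.apache.spark.executor.CoarseGrainedExecutorBackend <executor-args>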

It is working now...

The Thrift server was not picking up the executor memory from spark-env.sh, so I added it explicitly in the Thrift server startup script.

./sbin/start-thriftserver.sh

exec "$FWDIR"/sbin/spark-daemon.sh spark-submit $CLASS 1 --executor-memory 512M "$@"

With this, the executor starts with valid memory and JDBC queries return results.
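For a quick end-to-end check, the server can be queried with beeline, the standard HiveServer2 client (hostname and port taken from the startup command above; the query is just an example):

cd $SPARK_HOME
./bin/beeline -u jdbc:hive2://myhost:9999 -e "SHOW TABLES;"

If the executor comes up with a valid heap, the query returns results instead of the connection failing.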

conf/spark-env.sh (executor memory configurations not picked up by the Thrift server):

export SPARK_JAVA_OPTS="-Dspark.executor.memory=512M"
export SPARK_EXECUTOR_MEMORY=512M
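A less invasive alternative, depending on the Spark version, is to pass the memory setting directly on the command line, since start-thriftserver.sh forwards spark-submit options (a sketch based on the original startup command, untested here):

./sbin/start-thriftserver.sh --master spark://myhost:7077 \
  --executor-memory 512M \
  --hiveconf hive.server2.thrift.bind.host=myhost \
  --hiveconf hive.server2.thrift.port=9999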
