
How to set Spark to use a specific number of cores?

I have 3 machines, each with 12 cores. How can I set Spark to use all 12 cores on each machine?

In spark-env.sh I'm already setting the memory, but I can't find how to set the number of cores. Can you help?

export SPARK_WORKER_MEMORY=28G

Add the following to your spark-env.sh on all machines:

export SPARK_WORKER_CORES=12

SPARK_WORKER_CORES specifies the total number of cores that Spark applications are allowed to use on the machine (default: all available cores).
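For the setup described in the question (3 machines with 12 cores each), a minimal spark-env.sh on each machine could combine the memory setting from the question with the core setting above, for example:

export SPARK_WORKER_MEMORY=28G   # value taken from the question
export SPARK_WORKER_CORES=12     # offer all 12 cores of the machine to Spark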

In addition, if you want two workers on a single machine, try the following:

export SPARK_WORKER_INSTANCES=2
export SPARK_WORKER_CORES=6

This will start two workers with 6 cores each (still 12 cores per machine in total).

Check http://spark.apache.org/docs/latest/spark-standalone.html#cluster-launch-scripts for more details.

In your application you can call sparkConfig.set("spark.executor.cores", value). This sets the number of cores per executor.
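If you would rather not hard-code the value in the application, the same property can also be placed in conf/spark-defaults.conf; a minimal sketch, assuming 4 cores per executor:

spark.executor.cores   4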

If you're using spark-submit, you can also use options such as --total-executor-cores or --executor-cores. If you're running on YARN, you can also enable dynamic allocation (spark.dynamicAllocation.enabled).
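As a sketch of both approaches (the master URL, application class and jar name below are placeholders, and the core counts are only examples):

# Standalone cluster: cap the application at 24 cores in total, 6 per executor
spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 24 \
  --executor-cores 6 \
  --class com.example.MyApp \
  myapp.jar

# YARN: fix cores per executor and let dynamic allocation scale the number of executors
# (depending on the Spark version, dynamic allocation also needs the external shuffle service)
spark-submit \
  --master yarn \
  --executor-cores 6 \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --class com.example.MyApp \
  myapp.jar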

http://spark.apache.org/docs/latest/submitting-applications.html
http://spark.apache.org/docs/latest/configuration.html
