

Spark Worker and Executors Cores

I have a Spark cluster running in YARN mode on top of HDFS. I launched one worker with 2 cores and 2g of memory, then submitted a job with a dynamic configuration of 1 executor with 3 cores. Still, my job is able to run. Can somebody explain the difference between the number of cores with which the worker is launched and the number requested for the executors? My understanding was that since the executors run inside the workers, they cannot acquire more resources than those available to the worker.
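For reference, a submission matching the scenario above might look like the following sketch; the application class and jar names are placeholders, not taken from the question:

    # Hypothetical spark-submit for the scenario described:
    # one executor requesting 3 cores and 2g of memory on YARN.
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --num-executors 1 \
      --executor-cores 3 \
      --executor-memory 2g \
      --class com.example.MyApp \
      my-app.jar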

Check the parameter yarn.nodemanager.resource.cpu-vcores in yarn-site.xml.

yarn.nodemanager.resource.cpu-vcores controls the maximum total number of cores that containers may use on each node. Note that this is a configured value (its stock default is 8), not one detected from the hardware, so YARN can hand out more vcores than the machine physically has; that is likely why a 3-core executor can still be scheduled on a node started with only 2 cores.
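The property is set in yarn-site.xml on each NodeManager; a minimal snippet might look like the following, where the value 4 is just an example:

    <property>
      <name>yarn.nodemanager.resource.cpu-vcores</name>
      <value>4</value>
    </property>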

-> Spark launches some number of executors inside the worker nodes.
-> Spark uses the cores and executor-memory parameters supplied at application-submit time to launch those executors (see the sketch after this list).
-> In spark-submit you cannot specify the number of cores for a worker node, only for the executors.
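These executor settings can also be passed as Spark configuration properties instead of dedicated flags; a hypothetical, equivalent form of the submission above would be:

    # Same executor request expressed via --conf properties.
    spark-submit \
      --master yarn \
      --conf spark.executor.instances=1 \
      --conf spark.executor.cores=3 \
      --conf spark.executor.memory=2g \
      --class com.example.MyApp \
      my-app.jar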
