
Spark Local vs Cluster

What is the cluster-mode equivalent of standalone's local[N]? That is, which parameter in cluster mode takes the value we pass as N in local[N]?

In local[N], N is the maximum number of cores that can be used on the node at any point in time.
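As a minimal sketch of how N is set in local mode (the app name is a placeholder, assuming Spark's Scala API):

import org.apache.spark.sql.SparkSession

// local[4]: run Spark on this machine, using at most 4 cores.
// local[*] would instead use every core available on the machine.
val spark = SparkSession.builder()
  .master("local[4]")
  .appName("LocalExample") // hypothetical app name
  .getOrCreate()

println(spark.sparkContext.defaultParallelism) // typically prints 4 here
spark.stop()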

In cluster mode you can set --executor-cores N on spark-submit. It means that each executor can run at most N tasks at the same time.
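The --executor-cores flag corresponds to the spark.executor.cores property, so it can also be set programmatically. A hedged sketch (the master URL and app name below are placeholders):

import org.apache.spark.sql.SparkSession

// Equivalent of `spark-submit --executor-cores 4`:
// each executor may run at most 4 tasks concurrently.
val spark = SparkSession.builder()
  .master("spark://master-host:7077") // hypothetical standalone master URL
  .appName("ClusterExample")          // hypothetical app name
  .config("spark.executor.cores", "4")
  .getOrCreate()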

In standalone cluster mode, by default one executor runs per worker node, which means that a single executor takes all the cores on that worker. This can result in under-utilization of resources. Keep in mind that in cluster deploy mode the driver also occupies a worker node. One way to mitigate this is sketched below.
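To avoid one executor grabbing a whole worker, you can cap the cores each executor takes so a worker's cores can be shared. A minimal sketch assuming a standalone cluster whose workers have 8 cores (the URL, app name, and core counts are placeholders):

import org.apache.spark.sql.SparkSession

// On an 8-core worker, capping each executor at 2 cores lets the
// standalone master place several executors (possibly from different
// applications) on the same worker, instead of one executor taking all 8.
val spark = SparkSession.builder()
  .master("spark://master-host:7077")  // hypothetical master URL
  .appName("SharedClusterExample")     // hypothetical app name
  .config("spark.executor.cores", "2") // cores per executor
  .config("spark.cores.max", "8")      // total cores this app may use cluster-wide
  .getOrCreate()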
