
Apache Spark: How is application-level scheduling done in Spark standalone mode?

Standalone Mode

There is not much information available on "application-level scheduling", i.e. how multiple applications are scheduled in Spark standalone mode — only this link: http://spark.apache.org/docs/latest/job-scheduling.html . The only thing I have been able to find is that it is a FIFO scheduler. As I understand it, standalone scheduling can also be done in static or dynamic mode, but I could not find any information about dynamic mode. Can someone explain application scheduling in Spark standalone mode in detail?
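For context on the static (default) mode: in standalone deployments an application grabs all available cores unless you cap them, which is why later applications queue up FIFO-style. A sketch of capping cores so several applications can run side by side (the master URL and application file are placeholders):

```shell
# Static partitioning in standalone mode (sketch; spark://master:7077 and
# my_app.py are placeholders). Without spark.cores.max, the first application
# takes every core and subsequent applications wait in FIFO order.
spark-submit \
  --master spark://master:7077 \
  --conf spark.cores.max=4 \
  --conf spark.executor.memory=2g \
  my_app.py
```

With each application holding at most 4 cores, a cluster with more cores can serve multiple applications concurrently instead of strictly one after another.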

> info about dynamic mode.

This is when you configure the resources your application is going to use dynamically.

For example, you can configure the number of cores, the memory per executor, the number of executors, and so on.

Read more in Dynamic Resource Allocation.
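A sketch of enabling dynamic allocation when submitting an application (the master URL and application file are placeholders; the external shuffle service must be running on each worker for executors to be removed safely):

```shell
# Dynamic resource allocation in standalone mode (sketch; spark://master:7077
# and my_app.py are placeholders). Spark scales the number of executors
# between the min and max bounds based on the pending-task backlog.
spark-submit \
  --master spark://master:7077 \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.minExecutors=1 \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  --conf spark.executor.cores=2 \
  --conf spark.executor.memory=2g \
  my_app.py
```

Because executors are released when idle, resources freed by one application become available to others waiting in the queue, which softens the FIFO behavior of the default static mode.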

