
In which deployment mode can we “not” add nodes/workers to a cluster in Apache Spark 2.3.1?

1. Spark Standalone
2. Mesos
3. Kubernetes
4. YARN
5. Local Mode

I have installed Apache Spark 2.3.1 on my machine and have run it in Local Mode.

In Local Mode, can we add nodes/workers to Apache Spark?

When the master is "local", your program runs on a single machine, i.e. your edge node, so there are no separate workers to add. To run it in a distributed environment, i.e. on a cluster, you need to point the master at a cluster manager such as "yarn".
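By way of illustration, here is a minimal sketch of how the master URL is set when building a SparkSession (the object name and app name are made up for the example; "local[*]" keeps everything in one JVM, while "yarn" would hand scheduling to a YARN cluster):

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch (Spark 2.3.x): the master URL decides where work runs.
// "local[*]" runs everything in this one JVM, one task thread per core,
// so there are no separate workers to add. Swap in "yarn" to run on a
// YARN cluster instead.
object MasterUrlDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("master-url-demo")   // illustrative app name
      .master("local[*]")           // or "yarn" for a distributed run
      .getOrCreate()

    println(s"Running with master: ${spark.sparkContext.master}")
    spark.stop()
  }
}
```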

When the deploy mode is "client" (the default), the driver program runs on your edge node. When the deploy mode is "cluster", the driver runs on one of the healthy nodes in the cluster.
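The deploy mode itself is normally chosen at submit time (the --deploy-mode flag of spark-submit); inside a running application it is visible through the spark.submit.deployMode setting. A small hedged sketch (meant to be packaged and launched via spark-submit; the app name is illustrative, and the default "client" is used when the key is unset):

```scala
import org.apache.spark.sql.SparkSession

// Hedged sketch: reading back how the driver was launched.
// spark.submit.deployMode is "client" (driver on the submitting/edge node)
// or "cluster" (driver on a node picked inside the cluster).
val spark = SparkSession.builder().appName("deploy-mode-check").getOrCreate()
val deployMode = spark.conf.get("spark.submit.deployMode", "client")
println(s"Driver deploy mode: $deployMode")
```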
