
Multiple Spark Workers on Single Windows Machine

I am trying to teach myself Spark through Scala, using IntelliJ on Windows. I'm doing this on a single machine, and I would like to start multiple workers on that one machine to simulate a cluster. I read this page, which says:

"The launch scripts do not currently support Windows. To run a Spark cluster on Windows, start the master and workers by hand."

I don't know what it means to start the master and workers by hand. Could anyone help? Many thanks for any help/suggestions.

To start the Spark master manually, run the command below from %SPARK_HOME%\bin:

spark-class org.apache.spark.deploy.master.Master

The command above will also print the master URL, which looks like spark://ip:port.
The master UI can be accessed at localhost:8080.
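For example, from a Command Prompt in %SPARK_HOME%\bin, you can pin the master to an explicit host and port so the master URL is predictable. This is a sketch assuming a standard Spark distribution; the --host, --port, and --webui-port options are accepted by the Master class, and localhost/7077 here are just example values (7077 and 8080 are Spark's defaults anyway):

```shell
REM Run from %SPARK_HOME%\bin in its own Command Prompt window.
REM --host/--port fix the master URL to spark://localhost:7077;
REM --webui-port sets where the master UI is served.
spark-class org.apache.spark.deploy.master.Master --host localhost --port 7077 --webui-port 8080
```

Leave this window open; the master keeps running in the foreground and logs the spark://... URL you will pass to each worker.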

To start a Spark worker, run:

spark-class org.apache.spark.deploy.worker.Worker spark://ip:port

Now, if you refresh the master UI, you can see the new worker listed under the Workers section.
Repeat the command (in a separate terminal each time) to add multiple workers to the same master.
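When simulating a cluster on one machine, it can help to give each worker its own web UI port and a cap on cores and memory, so the workers don't each try to claim the whole machine. A sketch, assuming the master was started on spark://localhost:7077 (substitute the URL your master actually printed) and that each command runs in its own Command Prompt window:

```shell
REM Worker 1: 2 cores, 2 GB, UI on port 8081
spark-class org.apache.spark.deploy.worker.Worker --cores 2 --memory 2g --webui-port 8081 spark://localhost:7077

REM Worker 2: 2 cores, 2 GB, UI on port 8082 (run in a second window)
spark-class org.apache.spark.deploy.worker.Worker --cores 2 --memory 2g --webui-port 8082 spark://localhost:7077
```

After both are up, the master UI at localhost:8080 should list two workers, each advertising 2 cores and 2 GB.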
