Spark Worker - Change web ui host in standalone mode

When I view the master node's web UI, it shows all of the workers currently attached to the cluster.

https://spark.apache.org/docs/3.0.0-preview/web-ui.html

The issue I am having, though, is that the IP address shown for the worker nodes in the web UI is incorrect. Is there a way to change the worker's web UI host/IP that is displayed in the master's web UI?

Reading through the documentation, there appears to be a SPARK_WORKER_WEBUI_PORT variable, which sets the worker's web UI port, but there doesn't seem to be a corresponding SPARK_WORKER_WEBUI_HOST.

http://spark.apache.org/docs/latest/spark-standalone.html

To provide more context, I currently have a Spark cluster deployed in standalone mode. The cluster's master and workers are all behind a router (NAT). The workers register with the master using their internal IP addresses, and I set up port forwarding to route external traffic to the master and to each worker. Because the workers register with their internal IP addresses, those internal addresses are what appear in the master node's web UI, which makes each worker's web UI inaccessible to anyone outside my NAT. If there is a way to explicitly set the IP address used for each worker's web UI, that would resolve the problem. Thanks!

After more research, I determined that the environment variable I was looking for was SPARK_PUBLIC_DNS.

http://spark.apache.org/docs/latest/spark-standalone.html

This allowed me to set a different external host name for my workers.
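As a sketch of how this could be applied: SPARK_PUBLIC_DNS is set in `conf/spark-env.sh` on each worker before starting it. The hostname below is hypothetical; substitute the externally reachable name or IP that your NAT port forwarding exposes for that worker.

```shell
# conf/spark-env.sh on each worker node (path relative to the Spark install)

# Hostname advertised in the master's web UI links, instead of the
# internal bind address. Use the worker's externally reachable name.
export SPARK_PUBLIC_DNS=worker1.example.com   # hypothetical external hostname

# Optionally pin the worker web UI port (default 8081) so the NAT
# port-forwarding rule for this worker stays stable.
export SPARK_WORKER_WEBUI_PORT=8081
```

With this in place, the links in the master's web UI point at `worker1.example.com:8081` rather than the worker's internal NAT address.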
