
DSE Spark cluster on AWS: worker and executor ports

I am trying to set up a 6-node DSE 5.1 Spark cluster on AWS EC2 machines. I have referred to the DSE documentation. Just to get started, I opened all TCP ports; when I checked the logs, I found that the worker, executor, and driver processes are using the ports below:

33xxx
33xxx
33xxx
34xxx
34xxx
34xxx
35xxx
35xxx
35xxx
36xxx
37xxx
37xxx
39xxx
40xxx
40xxx
41xxx
41xxx
43xxx
43xxx
43xxx
43xxx
45xxx
46xxx

The range here is from 33xxx to 46xxx. What is the suggested range of ports to open? Or is there any way to bind the worker and executor ports to fixed values?

By default, the port selection is random.

See the Spark configuration docs, specifically:

spark.blockManager.port
spark.driver.port

While you can lock these down to specific values by setting them in the SparkConf or on the command line via spark-submit, you need to make sure that every application running concurrently uses unique values so the ports do not collide.
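As a minimal sketch of pinning these via spark-submit (the port numbers and the application file name are arbitrary example values, not recommendations; on DSE the equivalent entry point is `dse spark-submit`):

```shell
# Pin the driver and block manager ports for one application.
# If a pinned port is already taken, Spark increments it by 1 and
# retries, up to spark.port.maxRetries times (default 16), so the
# security group needs to allow the base port plus that retry range,
# e.g. 40000-40016 and 40020-40036 here.
spark-submit \
  --conf spark.driver.port=40000 \
  --conf spark.blockManager.port=40020 \
  --conf spark.port.maxRetries=16 \
  my_app.py
```

A second application submitted at the same time would need different base ports (or can rely on the retry increments, provided the opened range is wide enough to absorb them).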

In most cases it makes sense to keep the driver in the same network as the cluster (on AWS, the same VPC), so driver-executor traffic never crosses a firewall boundary.
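Once the ports are pinned to known values, the security-group rule can be narrowed from "all TCP" to just those ranges. A hedged example with the AWS CLI, where the security group ID and CIDR are placeholders for your own cluster's group and VPC subnet:

```shell
# Allow the pinned Spark port range (example values) for traffic
# originating inside the cluster's subnet only.
# sg-0123456789abcdef0 and 10.0.0.0/16 are placeholders.
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 40000-40036 \
  --cidr 10.0.0.0/16
```

Restricting the source to the cluster's own subnet (or to the security group itself) is generally preferable to opening the range to 0.0.0.0/0.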

