
Spark-submit in cluster mode makes the application view unusable?

I use Jenkins to run a spark-submit job every hour with --deploy-mode cluster set. Is this approach recommended? I ask because some things don't seem to work.

When I go to my Spark master's web UI and click the application name, the link points to my own computer on that port (which is firewalled) instead of to the server. With --deploy-mode cluster I'd expect to be able to see the application UI on the Spark master, since spark-submit returns immediately in this mode and the driver runs on the cluster.

Also, I can't seem to open the DAG. Is that related?

The full command I use is:

spark-submit --master spark://mysparkserver:6066 --deploy-mode cluster --class someClass --conf spark.driver.userClassPathFirst=true http://theJarUrl

Found it! Even though event logging can be enabled on the Spark master, you also need to enable it for the application itself by passing --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=/opt/spark/spark-events to spark-submit. After that you can view the application on the Spark master by clicking the application name, and it also lets you inspect finished applications.
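For reference, here is a minimal sketch of the full submit command with those two event-logging properties added. The master URL, class name, jar URL, and /opt/spark/spark-events path are the ones from this question, not something you can copy verbatim; it also assumes the event log directory already exists and is writable on the machine(s) that write and read the logs.

# Same command as above, plus event logging (hedged sketch, adjust names/paths for your cluster)
spark-submit \
  --master spark://mysparkserver:6066 \
  --deploy-mode cluster \
  --class someClass \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.eventLog.enabled=true \
  --conf spark.eventLog.dir=/opt/spark/spark-events \
  http://theJarUrl

If you prefer not to repeat the flags on every Jenkins run, the same two properties can instead be set as defaults in conf/spark-defaults.conf on the machine that runs spark-submit.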

Despite that, the DAG visualisation still doesn't work for me. If anyone has a suggestion for that, please leave a comment.
