How can I run my spark app with python in the background with my computer shut off?
I always get this error message when I add the deploy mode flag:

Error: Cluster deploy mode is currently not supported for python applications on standalone clusters.
dse spark-submit --master spark://localhost:7077 \
--deploy-mode cluster --executor-memory 4G --total-executor-cores 2 \
--driver-memory 1G \
--packages org.apache.spark:spark-streaming-kafka_2.10:1.4.1 \
--jars /root/spark-streaming-kafka_2.10-1.4.1.jar \
/root/pythonspark/com/spark/articles_axes.py weibo_article weibohao
You can, for example, submit your application from a detached GNU Screen session started on the same machine where you run the Spark master (note that --deploy-mode cluster is dropped, since cluster deploy mode is not supported for Python applications on standalone clusters):
screen -dmS spark dse spark-submit --master spark://localhost:7077 \
--executor-memory 4G --total-executor-cores 2 \
...
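If GNU Screen is not available, `nohup` achieves a similar effect: it detaches the command from the terminal so the driver survives logout. A minimal sketch, using `sleep` as a stand-in for the `dse spark-submit` invocation above (the log path is illustrative):

```shell
# nohup detaches the command from the terminal; stdout/stderr go to a log file.
# Replace "sleep 5" with the dse spark-submit command from the question
# (without --deploy-mode cluster).
nohup sleep 5 > /tmp/spark_app.log 2>&1 &
APP_PID=$!
# kill -0 checks that the process is still alive without sending a signal.
kill -0 "$APP_PID" && echo "driver running with PID $APP_PID"
```

Either way, the process keeps running on the master machine after you disconnect; your own computer can then be shut off.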