
How can I run my Python Spark app in the background so it keeps running after I turn off my computer?

I always get this error message when I add the deploy mode option:

Error: Cluster deploy mode is currently not supported for python applications on standalone clusters.

dse spark-submit --master spark://localhost:7077 \
  --deploy-mode cluster --executor-memory 4G --total-executor-cores 2 \
  --driver-memory 1G \
  --packages org.apache.spark:spark-streaming-kafka_2.10:1.4.1 \
  --jars /root/spark-streaming-kafka_2.10-1.4.1.jar \
  /root/pythonspark/com/spark/articles_axes.py weibo_article weibohao

As the error says, cluster deploy mode is not supported for Python applications on standalone clusters, so the driver has to run in client mode. To keep it alive after you disconnect, you can, for example, submit the application from a detached GNU Screen session started on the same machine where the Spark master runs (note that --deploy-mode cluster is omitted):

screen -dmS spark dse spark-submit --master spark://localhost:7077 \
  --executor-memory 4G --total-executor-cores 2 \
  ...
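The job then keeps running after you log out or shut off your own computer, as long as the machine hosting the session stays up. You can later SSH back in and reattach to check on the driver; the session name spark below is taken from the -dmS flag in the command above:

# list detached screen sessions on this machine
screen -ls

# reattach to the session to watch the driver output
screen -r spark

# detach again without stopping the job: press Ctrl-A, then D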

