
Run docker commands with linux "at" function

I am running a PySpark script that executes a while loop for 2 hours. I'd love to be able to schedule this to run overnight.

I type these two lines in order to run the code. After ssh-ing into the server, I type:

docker exec -it r-hadoop-spark-yarn-spark-master-1-085ae410 bash

This takes me to a working directory inside the container. I then type:

nohup /opt/spark-2.0.2/bin/spark-submit --master spark://hadoop-spark-yarn-spark-master-1:7077 --executor-memory 8G bm_plog_hdfs.py &> SparkPLOG.out &

to run the script (if all goes well).

I have tried inserting these two lines, without the 'bash', into a script and scheduling it with 'at', but it doesn't appear to run.
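Roughly, that attempt would look like the sketch below (the run_spark.sh name and the 02:00 start time are hypothetical; -it is omitted because jobs started by 'at' run without a terminal attached):

  #!/bin/bash
  # run_spark.sh - submit the job inside the already-running container
  # (no -it: 'at' jobs have no TTY, so an interactive exec would fail)
  docker exec r-hadoop-spark-yarn-spark-master-1-085ae410 \
    bash -c "nohup /opt/spark-2.0.2/bin/spark-submit --master spark://hadoop-spark-yarn-spark-master-1:7077 --executor-memory 8G bm_plog_hdfs.py &> SparkPLOG.out &"

  # schedule the wrapper script for 2 AM
  echo "bash ~/run_spark.sh" | at 02:00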

I have also tried detached mode with -d, by running:

  docker exec -d r-hadoop-spark-yarn-spark-master-1-085ae410 bash -c "nohup /opt/spark-2.0.2/bin/spark-submit --master spark://hadoop-spark-yarn-spark-master-1:7077 --executor-memory 8G bm_plog_hdfs.py &> SparkPLOG.out &"
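This detached variant could also be handed straight to 'at', along these lines (the 'now + 8 hours' delay is just an example; since -d already detaches the exec, the inner nohup and trailing & are probably redundant and are dropped here):

  # hand the detached docker exec to 'at' as a one-liner
  echo 'docker exec -d r-hadoop-spark-yarn-spark-master-1-085ae410 bash -c "/opt/spark-2.0.2/bin/spark-submit --master spark://hadoop-spark-yarn-spark-master-1:7077 --executor-memory 8G bm_plog_hdfs.py &> SparkPLOG.out"' | at now + 8 hours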
