Kill Spark Job programmatically
I am running a pyspark application through a Jupyter notebook. I can kill a job using the Spark Web UI, but I want to kill it programmatically.
How can I kill it?
To expand on @Netanel Malka's answer, you can use the cancelAllJobs method to cancel every running job, or the cancelJobGroup method to cancel jobs that have been organized into a group.
From the PySpark documentation:
cancelAllJobs()
Cancel all jobs that have been scheduled or are running.
cancelJobGroup(groupId)
Cancel active jobs for the specified group. See SparkContext.setJobGroup for more information.
And an example from the docs:
import threading
from time import sleep

result = "Not Set"
lock = threading.Lock()

def map_func(x):
    sleep(100)
    raise Exception("Task should have been cancelled")

def start_job(x):
    global result
    try:
        sc.setJobGroup("job_to_cancel", "some description")
        result = sc.parallelize(range(x)).map(map_func).collect()
    except Exception as e:
        result = "Cancelled"
    lock.release()

def stop_job():
    sleep(5)
    sc.cancelJobGroup("job_to_cancel")

suppress = lock.acquire()
suppress = threading.Thread(target=start_job, args=(10,)).start()
suppress = threading.Thread(target=stop_job).start()
suppress = lock.acquire()
print(result)
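The coordination in the example above uses only standard threading primitives; here is the same pattern without Spark, as a minimal sketch. The `CancellableTask` class is illustrative, not part of PySpark — its `cancelled` event plays the role that `sc.cancelJobGroup` plays for a real job group:

```python
import threading
from time import sleep

# Illustrative stand-in for a long-running Spark job: a loop that
# periodically checks a cancellation flag, mirroring how a cancelled
# job group interrupts its tasks before they run to completion.
class CancellableTask:
    def __init__(self):
        self.cancelled = threading.Event()
        self.result = "Not Set"

    def run(self):
        for _ in range(100):
            if self.cancelled.is_set():
                self.result = "Cancelled"
                return
            sleep(0.05)  # simulated unit of work
        self.result = "Finished"

task = CancellableTask()
worker = threading.Thread(target=task.run)
worker.start()

sleep(0.2)            # let the task get underway
task.cancelled.set()  # plays the role of sc.cancelJobGroup("job_to_cancel")
worker.join()
print(task.result)    # Cancelled
```

The same structure applies to the Spark version: one thread starts the work, a second thread issues the cancellation, and the main thread waits for the outcome.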
Suppose that you wrote this code:
from pyspark import SparkContext
sc = SparkContext("local", "Simple App")
# This will stop your app
sc.stop()
As described in the docs: http://spark.apache.org/docs/latest/api/python/pyspark.html?highlight=stop#pyspark.SparkContext.stop