
How to stop a running Spark Streaming application gracefully?

How do I stop Spark Streaming? My Spark Streaming job runs continuously, and I want to stop it in a graceful manner.

I have seen the option below for shutting down a streaming application:

sparkConf.set("spark.streaming.stopGracefullyOnShutdown","true") 

Spark configuration: available properties

But how do I update this parameter on a running application?
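You can't: this flag is read when the StreamingContext starts, so it has to be in place before the job launches, not changed at runtime. A minimal sketch of setting it at submit time (the class name and jar path are placeholders for your own job):

```shell
# Graceful-shutdown flag must be set before the driver starts;
# it cannot be updated on an already-running application.
spark-submit \
  --class DataPipelineStreamDriver \
  --conf spark.streaming.stopGracefullyOnShutdown=true \
  your-streaming-job.jar
```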

Have a look at this blog post. It is the "nicest" way to gracefully terminate a streaming job I have come across.

How to pass the shutdown signal:

Now we know how to ensure a graceful shutdown in Spark Streaming. But how can we pass the shutdown signal to the application? One naive option is to press CTRL+C in the terminal where the driver program is running, but that is obviously not a good option. The solution I am using is to grep for the Spark Streaming driver process and send it a SIGTERM signal. When the driver receives this signal, it initiates the graceful shutdown of the application. We can put the command below in a shell script and run the script to pass the shutdown signal:

ps -ef | grep spark | grep <driver-program-name> | awk '{print $2}' | xargs kill -SIGTERM

e.g. ps -ef | grep spark | grep DataPipelineStreamDriver | awk '{print $2}' | xargs kill -SIGTERM
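The signal mechanics above can be sketched outside of Spark: a long-running "driver" process installs a SIGTERM handler and exits cleanly when the signal arrives. Everything here (the stand-in child script, the `run_demo` helper) is illustrative plumbing, not Spark code:

```python
import signal
import subprocess
import sys
import time

# A stand-in for a streaming driver: it installs a SIGTERM handler
# and runs until the signal arrives, then exits cleanly (code 0).
CHILD = """
import signal, sys, time

def on_sigterm(signum, frame):
    print("SIGTERM received, initiating graceful shutdown")
    sys.exit(0)

signal.signal(signal.SIGTERM, on_sigterm)
while True:
    time.sleep(0.1)
"""

def run_demo():
    # Launch the "driver" as a separate OS process, like a Spark driver.
    proc = subprocess.Popen([sys.executable, "-c", CHILD],
                            stdout=subprocess.PIPE, text=True)
    time.sleep(1.0)                    # give it time to install the handler
    proc.send_signal(signal.SIGTERM)   # equivalent of `kill -SIGTERM <pid>`
    out, _ = proc.communicate(timeout=10)
    return proc.returncode, out.strip()

if __name__ == "__main__":
    code, message = run_demo()
    print(code, message)
```

The key point the sketch illustrates: SIGTERM (unlike SIGKILL) is catchable, which is what lets the Spark driver run its shutdown hook and stop the streaming context gracefully.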
