
What is the best way to restart spark streaming application?

I basically want to write an event callback in my driver program which will restart the Spark Streaming application on arrival of that event. My driver program sets up the streams and the execution logic by reading configurations from a file. Whenever the file is changed (new configs added), the driver program has to perform the following steps in sequence:

  1. Restart,
  2. Read the config file (as part of the main method) and
  3. Set up the streams

What is the best way to achieve this?

In some cases you may want to reload the streaming context dynamically (for example, to reload the streaming operations). In that case you may do the following (Scala example):

import org.apache.spark.SparkContext
import org.apache.spark.streaming.{Seconds, StreamingContext}

val sparkContext = new SparkContext()

// flags flipped by your event callback (e.g. when the config file changes)
var stopEvent = false
var shouldReload = false
var streamingContext = Option.empty[StreamingContext]

val processThread = new Thread {
  override def run(): Unit = {
    while (!stopEvent) {
      if (streamingContext.isEmpty) {

        // new context
        streamingContext = Option(new StreamingContext(sparkContext, Seconds(1)))

        // create DStreams
        val lines = streamingContext.get.socketTextStream(...)

        // your transformations and actions
        // and decision to reload streaming context
        // ...

        streamingContext.get.start()
      } else {
        if (shouldReload) {
          // stop only the streaming context, keep the SparkContext for reuse
          streamingContext.get.stop(stopSparkContext = false, stopGracefully = true)
          streamingContext.get.awaitTermination()
          streamingContext = Option.empty[StreamingContext]
          shouldReload = false
        } else {
          Thread.sleep(1000)
        }
      }
    }
    // final shutdown: stop the SparkContext as well
    streamingContext.foreach { ssc =>
      ssc.stop(stopSparkContext = true, stopGracefully = true)
      ssc.awaitTermination()
    }
  }
}

// and start it in a separate thread
processThread.start()
processThread.join()

or in Python:

import time
import threading
from threading import Event

from pyspark import SparkContext
from pyspark.streaming import StreamingContext

spark_context = SparkContext()

# flags flipped by your event callback (e.g. when the config file changes)
stop_event = Event()
should_reload = False
spark_streaming_context = None

def process():
    global spark_streaming_context, should_reload
    while not stop_event.is_set():
        if spark_streaming_context is None:

            # new context
            spark_streaming_context = StreamingContext(spark_context, 0.5)

            # create DStreams
            lines = spark_streaming_context.socketTextStream(...)

            # your transformations and actions
            # and decision to reload streaming context
            # ...

            spark_streaming_context.start()
        else:
            # TODO move to config
            if should_reload:
                # stop only the streaming context, keep the SparkContext for reuse
                spark_streaming_context.stop(stopSparkContext=False, stopGraceFully=True)
                spark_streaming_context.awaitTermination()
                spark_streaming_context = None
                should_reload = False
            else:
                time.sleep(1)

    # final shutdown: stop the SparkContext as well
    spark_streaming_context.stop(stopSparkContext=True, stopGraceFully=True)
    spark_streaming_context.awaitTermination()


# and start it in a separate thread
process_thread = threading.Thread(target=process)
process_thread.start()
process_thread.join()
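
To actually flip shouldReload / should_reload when the question's config file changes, one possibility is a small file-watcher thread on the driver. This is only a minimal sketch under my own assumptions (it reuses the stopEvent and shouldReload vars from the Scala example above, and configPath is an illustrative location, not anything from the original answer):

import java.nio.file.{FileSystems, Paths, StandardWatchEventKinds}
import scala.collection.JavaConverters._

// hypothetical config location
val configPath = Paths.get("/path/to/app.conf")

val watcherThread = new Thread {
  override def run(): Unit = {
    val watchService = FileSystems.getDefault.newWatchService()
    // watch the directory containing the config file for modifications
    configPath.getParent.register(watchService, StandardWatchEventKinds.ENTRY_MODIFY)
    while (!stopEvent) {
      val key = watchService.take() // blocks until something in the directory changes
      val configTouched = key.pollEvents().asScala
        .exists(event => configPath.endsWith(event.context().toString))
      if (configTouched) shouldReload = true // the reload loop above picks this up
      key.reset()
    }
  }
}
watcherThread.setDaemon(true)
watcherThread.start()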

If you want to protect your code from crashes and restart the streaming context from the last processed point, use the checkpointing mechanism. It allows you to restore your job state after a failure.
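
As a minimal sketch of that mechanism (the checkpoint directory and the createContext function below are my own illustrative names, not part of the original answer), the usual pattern is StreamingContext.getOrCreate, which rebuilds the context from checkpoint data if it exists:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// assumed checkpoint location
val checkpointDir = "hdfs:///tmp/streaming-checkpoint"

// Builds a fresh context and enables checkpointing; only invoked when no
// checkpoint data exists yet.
def createContext(): StreamingContext = {
  val conf = new SparkConf().setAppName("RestartableStreamingApp")
  val ssc = new StreamingContext(conf, Seconds(1))
  ssc.checkpoint(checkpointDir)
  // define DStreams, transformations and actions here
  ssc
}

// Recover from the checkpoint after a failure, or create a new context.
val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
ssc.start()
ssc.awaitTermination()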

The best way to restart Spark actually depends on your environment, but it is always advisable to use the spark-submit console.

You can background the spark-submit process like any other Linux process by putting it into the background in the shell. In your case, the spark-submit job actually runs the driver on YARN, so it's babysitting a process that's already running asynchronously on another machine via YARN.

Cloudera blog

One way that we explored recently (at a Spark meetup here) was to achieve this by using ZooKeeper in tandem with Spark. In a nutshell, this uses Apache Curator to watch for changes on ZooKeeper (a change in the ZK config, which can be triggered by your external event) that then causes a listener to restart.

The referenced code base is here; you will find that a change in config causes the Watcher (a Spark Streaming app) to reboot after a graceful shutdown and reload the changes. Hope this is a pointer!
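
As a hedged illustration of that idea (not the referenced code base; the connection string, znode path, and restartStreamingContext hook are assumptions), a Curator NodeCache listener can watch a config znode and trigger a graceful restart when it changes:

import org.apache.curator.framework.CuratorFrameworkFactory
import org.apache.curator.framework.recipes.cache.{NodeCache, NodeCacheListener}
import org.apache.curator.retry.ExponentialBackoffRetry

// hypothetical restart hook: wire this to your own stop/rebuild logic, e.g.
// ssc.stop(stopSparkContext = false, stopGracefully = true) and re-create the streams
def restartStreamingContext(): Unit = {
  // ...
}

// assumed ZooKeeper connection string and config znode
val client = CuratorFrameworkFactory.newClient("zk-host:2181", new ExponentialBackoffRetry(1000, 3))
client.start()

val configNode = new NodeCache(client, "/myapp/streaming-config")
configNode.getListenable.addListener(new NodeCacheListener {
  override def nodeChanged(): Unit = {
    // the znode's data changed (triggered by your external event): restart gracefully
    restartStreamingContext()
  }
})
configNode.start(true) // true = build the initial cache before returning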

I am currently solving this issue as follows:

  • Listen to external events by subscribing to an MQTT topic

  • In the MQTT callback, stop the streaming context with ssc.stop(true, true), which will gracefully shut down the streams and the underlying SparkContext (see the callback sketch after the code below)

  • Start the Spark application again by creating a SparkConf and setting up the streams by reading the config file

// Contents of startSparkApplication() method
sparkConf = new SparkConf().setAppName("SparkAppName")
ssc = new StreamingContext(sparkConf, Seconds(1))
val myStream = MQTTUtils.createStream(ssc, ...) // provide other options
myStream.print()
ssc.start()
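
A hedged sketch of the callback wiring described in the bullets above, using the Eclipse Paho MQTT client; the class name, broker URL, and topic are illustrative assumptions, and startSparkApplication is assumed to be the method whose contents are shown above, returning the StreamingContext it creates:

import org.apache.spark.streaming.StreamingContext
import org.eclipse.paho.client.mqttv3.{IMqttDeliveryToken, MqttCallback, MqttClient, MqttMessage}

// Hypothetical wiring: startSparkApplication (re)builds the streams and returns the new ssc.
class RestartOnMqttEvent(startSparkApplication: () => StreamingContext) {

  @volatile private var ssc: StreamingContext = startSparkApplication()

  // assumed broker URL and control topic
  private val mqttClient = new MqttClient("tcp://broker-host:1883", MqttClient.generateClientId())

  mqttClient.setCallback(new MqttCallback {
    override def messageArrived(topic: String, message: MqttMessage): Unit = {
      // external event arrived: gracefully stop the streams and the underlying
      // SparkContext, then rebuild everything by re-reading the config file
      ssc.stop(true, true)
      ssc = startSparkApplication()
    }
    override def connectionLost(cause: Throwable): Unit = ()
    override def deliveryComplete(token: IMqttDeliveryToken): Unit = ()
  })

  mqttClient.connect()
  mqttClient.subscribe("app/config-changed")
}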

The application is built as a Spring Boot application.

In Scala, stopping the sparkStreamingContext may involve stopping the SparkContext as well. I have found that when a receiver hangs, it is best to restart both the SparkContext and the StreamingContext.

I am sure the code below can be written much more elegantly, but it allows you to restart the SparkContext and StreamingContext programmatically. Once this is done, you can restart your receivers programmatically as well.

package coname.utilobjects

import com.typesafe.config.ConfigFactory
import grizzled.slf4j.Logging
import coname.conameMLException
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.{Seconds, StreamingContext}

import scala.collection.mutable


object SparkConfProviderWithStreaming extends Logging
{
  val sparkVariables: mutable.HashMap[String, Any] = new mutable.HashMap
}



trait SparkConfProviderWithStreaming extends Logging {

  private val keySSC = "SSC"
  private val keyConf = "conf"
  private val keySparkSession = "spark"

  lazy val packagesversion = ConfigFactory.load("streaming").getString("streaming.cassandraconfig.packagesversion")
  lazy val sparkcassandraconnectionhost = ConfigFactory.load("streaming").getString("streaming.cassandraconfig.sparkcassandraconnectionhost")
  lazy val sparkdrivermaxResultSize = ConfigFactory.load("streaming").getString("streaming.cassandraconfig.sparkdrivermaxResultSize")
  lazy val sparknetworktimeout = ConfigFactory.load("streaming").getString("streaming.cassandraconfig.sparknetworktimeout")

  @throws(classOf[conameMLException])
  def intitializeSpark(): Unit =
  {
    getSparkConf()
    getSparkStreamingContext()
    getSparkSession()
  }

  @throws(classOf[conameMLException])
  def getSparkConf(): SparkConf = {
    try {
      if (!SparkConfProviderWithStreaming.sparkVariables.get(keyConf).isDefined) {
        logger.info("\n\nLoading new conf\n\n")
        val conf = new SparkConf().setMaster("local[4]").setAppName("MLPCURLModelGenerationDataStream")
        conf.set("spark.streaming.stopGracefullyOnShutdown", "true")
        conf.set("spark.cassandra.connection.host", sparkcassandraconnectionhost)
        conf.set("spark.driver.maxResultSize", sparkdrivermaxResultSize)
        conf.set("spark.network.timeout", sparknetworktimeout)


        SparkConfProviderWithStreaming.sparkVariables.put(keyConf, conf)
        logger.info("Loaded new conf")
        getSparkConf()
      }
      else {
        logger.info("Returning initialized conf")
        SparkConfProviderWithStreaming.sparkVariables.get(keyConf).get.asInstanceOf[SparkConf]
      }
    }
    catch {
      case e: Exception =>
        logger.error(e.getMessage, e)
        throw new conameMLException(e.getMessage)
    }

  }

  @throws(classOf[conameMLException])
  def killSparkStreamingContext(): Unit = {
    try {
      if (SparkConfProviderWithStreaming.sparkVariables.get(keySSC).isDefined) {
        // stop the running streaming context (and its SparkContext) before dropping
        // the cached references, so a fresh pair can be created on the next call
        SparkConfProviderWithStreaming.sparkVariables(keySSC).asInstanceOf[StreamingContext]
          .stop(stopSparkContext = true, stopGracefully = true)
        SparkConfProviderWithStreaming.sparkVariables -= keySSC
        SparkConfProviderWithStreaming.sparkVariables -= keyConf
      }
      SparkSession.clearActiveSession()
      SparkSession.clearDefaultSession()
    }
    catch {
      case e: Exception =>
        logger.error(e.getMessage, e)
        throw new conameMLException(e.getMessage)
    }
  }

  @throws(classOf[conameMLException])
  def getSparkStreamingContext(): StreamingContext = {
    try {
      if (!SparkConfProviderWithStreaming.sparkVariables.get(keySSC).isDefined) {
        logger.info("\n\nLoading new streaming\n\n")
        SparkConfProviderWithStreaming.sparkVariables.put(keySSC, new StreamingContext(getSparkConf(), Seconds(6)))

        logger.info("Loaded streaming")
        getSparkStreamingContext()
      }
      else {
        SparkConfProviderWithStreaming.sparkVariables.get(keySSC).get.asInstanceOf[StreamingContext]
      }
    }
    catch {
      case e: Exception =>
        logger.error(e.getMessage, e)
        throw new conameMLException(e.getMessage)
    }
  }

  def getSparkSession(): SparkSession = {
    if (SparkSession.getActiveSession.isEmpty) {
      SparkSession.builder.config(getSparkConf()).getOrCreate()
    }
    else {
      SparkSession.getActiveSession.get
    }
  }

}
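
A minimal usage sketch under my own assumptions (the MyStreamingDriver object and the restart trigger are illustrative, not part of the original answer): initialize Spark once, and when an external event arrives, kill and re-initialize so that a fresh StreamingContext picks up the new configuration.

object MyStreamingDriver extends SparkConfProviderWithStreaming {

  def main(args: Array[String]): Unit = startJob()

  // builds (or rebuilds) the conf, streaming context and session, then wires the streams
  def startJob(): Unit = {
    intitializeSpark()                  // method name as defined in the trait above
    val ssc = getSparkStreamingContext()
    // define DStreams, transformations and actions here, then:
    ssc.start()
  }

  // call this from your external event callback (file change, MQTT message, ZK watch, ...)
  def restartJob(): Unit = {
    killSparkStreamingContext           // stops and clears the cached contexts
    startJob()                          // re-read config and rebuild everything
  }
}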
