More than one spark context error

I have this Spark code below:

    import org.apache.hadoop.hbase.client._
    import org.apache.hadoop.hbase.{ HBaseConfiguration, HTableDescriptor }
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.util.Bytes

    import kafka.serializer.StringDecoder

    import org.apache.spark._
    import org.apache.spark.SparkContext._
    import org.apache.spark.streaming._
    import org.apache.spark.streaming.kafka._

    object Hbase {
      def main(args: Array[String]) {
        val sparkConf = new SparkConf().setAppName("Spark-Hbase").setMaster("local[2]")
        val sc = new SparkContext(sparkConf)

        ...

        val ssc = new StreamingContext(sparkConf, Seconds(3))
        val kafkaBrokers = Map("metadata.broker.list" -> "localhost:9092")
        val topics = List("test").toSet
        val lines = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaBrokers, topics)
      }
    }

Now the error I am getting is:

Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true.

Is there anything wrong with my code above? I do not see where I am creating the context again...

These are the two SparkContexts you're creating. This is not allowed:

    val sc = new SparkContext(sparkConf)
    val ssc = new StreamingContext(sparkConf, Seconds(3))

You should create the streaming context from the original context.

    val ssc = new StreamingContext(sc, Seconds(3))
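
For reference, here is a minimal sketch of how the whole main method could look with a single SparkContext. The print(), start() and awaitTermination() calls are illustrative additions, not part of the original code:

    import kafka.serializer.StringDecoder

    import org.apache.spark._
    import org.apache.spark.streaming._
    import org.apache.spark.streaming.kafka._

    object Hbase {
      def main(args: Array[String]) {
        val sparkConf = new SparkConf().setAppName("Spark-Hbase").setMaster("local[2]")
        val sc = new SparkContext(sparkConf)

        // Build the StreamingContext from the existing SparkContext,
        // so only one SparkContext exists in this JVM.
        val ssc = new StreamingContext(sc, Seconds(3))

        val kafkaBrokers = Map("metadata.broker.list" -> "localhost:9092")
        val topics = Set("test")
        val lines = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaBrokers, topics)

        // An output operation is required before the streaming job can start.
        lines.map(_._2).print()

        ssc.start()
        ssc.awaitTermination()
      }
    }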

You are initializing two Spark contexts in the same JVM, i.e. a SparkContext and a StreamingContext. That's why you are getting this exception. You can set spark.driver.allowMultipleContexts = true in the config, although multiple Spark contexts are discouraged and you can get unexpected results.
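
If you really want to keep two separate contexts (not recommended), the flag named in the error message would be set on the SparkConf before either context is created, roughly like this:

    // Discouraged: allowing multiple SparkContexts in one JVM can give unexpected results.
    val sparkConf = new SparkConf()
      .setAppName("Spark-Hbase")
      .setMaster("local[2]")
      .set("spark.driver.allowMultipleContexts", "true")

    val sc = new SparkContext(sparkConf)
    val ssc = new StreamingContext(sparkConf, Seconds(3))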
