
ERROR SparkContext: Error initializing SparkContext. java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200

So I am new to Spark. My versions are: Spark 2.1.2, Scala 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131). I am using IntelliJ IDEA 2018 Community on Windows 10 (x64), and whenever I try to run a simple word count example I get the following error:

18/10/22 01:43:14 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
    at WordCount$.main(WordCount.scala:5)
    at WordCount.main(WordCount.scala)

PS: this is the code of the word counter that I use as an example:

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("mySpark").setMaster("local")
    val sc = new SparkContext(conf)

    // Read the input file, split each line on tabs and count the words.
    val rdd = sc.textFile(args(0))
    val wordcount = rdd.flatMap(_.split("\t")).map((_, 1))
      .reduceByKey(_ + _)

    for (arg <- wordcount.collect())
      print(arg + " ")
    println()
    // wordcount.saveAsTextFile(args(1))
    // wordcount.saveAsTextFile("myFile")
    sc.stop()
  }
}

So my question is how to get rid of this error. I have searched for a solution and tried installing different versions of Spark, the JDK and Hadoop, but it didn't help. I don't know where the problem may be.

If you are in IntelliJ you may struggle a lot. What I did, and it worked, was to initialize the SparkContext before the SparkSession by doing:

  1. val conf: SparkConf = new SparkConf().setAppName("name").setMaster("local").set("spark.testing.memory", "2147480000")

  2. val sc: SparkContext = new SparkContext(conf)

There may be a better solution, because here I don't actually need to initialise the SparkContext myself, since that is done implicitly when the SparkSession is initialised.
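A minimal sketch of that idea, assuming the same spark.testing.memory workaround and an illustrative app name; the SparkContext is then taken from the session instead of being built by hand:

import org.apache.spark.sql.SparkSession

object WordCountSession {
  def main(args: Array[String]): Unit = {
    // Pass the memory override through the builder instead of a hand-built SparkConf.
    val spark = SparkSession.builder()
      .appName("mySpark")
      .master("local")
      .config("spark.testing.memory", "2147480000") // ~2 GB, same workaround as above
      .getOrCreate()

    // The SparkContext is created implicitly and can be reached from the session.
    val sc = spark.sparkContext
    println(sc.appName)

    spark.stop()
  }
}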

Go to Settings -> Run/Debug Configurations, and for VM options put:

-Xms128m -Xmx512m -XX:MaxPermSize=300m -ea
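The error appears because the driver JVM's maximum heap is below the roughly 450 MB (471859200 bytes) that Spark requires, and Spark takes that value from the JVM unless spark.testing.memory overrides it. After changing the VM options you can verify from inside the program that the new heap clears the threshold; a minimal sketch (the object name is just an illustration):

object HeapCheck {
  def main(args: Array[String]): Unit = {
    // Spark compares the driver's max heap against a 471859200-byte minimum.
    val maxHeapBytes = Runtime.getRuntime.maxMemory
    println(s"Driver max heap: $maxHeapBytes bytes")
    println(s"Meets Spark's 471859200-byte minimum: ${maxHeapBytes >= 471859200L}")
  }
}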
