Out of memory error while building Spark
I am building Spark using sbt. When I run the following command:
sbt/sbt assembly
it takes some time to build Spark. Several warnings appear, and at the end I get the following errors:
[error] java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
[error] Use 'last' for the full log.
When I check the sbt version using the command sbt sbtVersion, I get the following result:
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.typesafe.sbt:sbt-git:0.6.1 -> 0.6.2
[warn] * com.typesafe.sbt:sbt-site:0.7.0 -> 0.7.1
.......
[info] streaming-zeromq/*:sbtVersion
[info] 0.13.7
[info] repl/*:sbtVersion
[info] 0.13.7
[info] spark/*:sbtVersion
[info] 0.13.7
When I run the command ./bin/spark-shell, I get the following output:
ls: cannot access '/home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10': No such file or directory
Failed to find Spark assembly in /home/neel_shah/spark/spark-1.6.1/assembly/target/scala-2.10.
You need to build Spark before running this program.
What can the solution be?
You must configure the SBT heap size:
On Linux, run
export SBT_OPTS="-Xmx2G"
to set it temporarily, or edit ~/.bash_profile and add the line export SBT_OPTS="-Xmx2G" to make it permanent.
On Windows, run
set JAVA_OPTS=-Xmx2G
to set it temporarily, or edit sbt\conf\sbtconfig.txt and set -Xmx2G to make it permanent.
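For the Linux/macOS case, the steps above can be sketched as the following shell commands (assuming a bash login shell that reads ~/.bash_profile; zsh users would edit ~/.zshrc instead):

```shell
# Set the SBT heap to 2 GB for the current shell session only:
export SBT_OPTS="-Xmx2G"

# Persist it for future sessions by appending to the shell profile
# (~/.bash_profile is an assumption -- adjust for your shell):
echo 'export SBT_OPTS="-Xmx2G"' >> ~/.bash_profile

# Confirm the setting before rebuilding Spark:
echo "$SBT_OPTS"
```

After this, re-running sbt/sbt assembly should pick up the larger heap.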
More info: http://www.scala-sbt.org/0.13.1/docs/Getting-Started/Setup.html
This is probably not a common resolution, but in my case I had to run this command to resolve the OutOfMemoryError when building a Spark project with sbt (the path is specific to macOS):
rm -rf /Users/markus.braasch/Library/Caches/Coursier/v1/https/
Increasing memory settings for a variety of arguments in SBT_OPTS did not solve it.
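The cache path above is specific to one user on macOS. A hedged sketch of the same cleanup that works across platforms by honoring the COURSIER_CACHE environment variable (the default paths below are assumptions; check your own setup):

```shell
# Coursier's default cache location differs by OS, e.g.:
#   Linux:   ~/.cache/coursier/v1
#   macOS:   ~/Library/Caches/Coursier/v1
# Use COURSIER_CACHE if it is set, otherwise fall back to the
# Linux default (an assumption -- adjust for your platform):
CACHE_DIR="${COURSIER_CACHE:-$HOME/.cache/coursier/v1}"

echo "Clearing downloaded artifacts under: $CACHE_DIR"
rm -rf "$CACHE_DIR/https"
```

Coursier will re-download dependencies on the next build, so this only costs bandwidth, not correctness.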