Spark 1.3.1 build fails in MLlib when I run make-distribution.sh on Ubuntu 14.04
Java: java version "1.7.0_80", Java(TM) SE Runtime Environment (build 1.7.0_80-b15), Java HotSpot(TM) 64-Bit Server VM (build 24.80-b11, mixed mode)
Scala: Scala code runner version 2.10.4 -- Copyright 2002-2013, LAMP/EPFL
```
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project ML Library 1.3.2-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for net.sf.opencsv:opencsv:jar:2.3 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-mllib_2.10 ---
[INFO] Deleting /home/tongz/project/spark/spark/mllib/target
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ spark-mllib_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.0:add-source (eclipse-add-source) @ spark-mllib_2.10 ---
[INFO] Add Source directory: /home/tongz/project/spark/spark/mllib/src/main/scala
[INFO] Add Test Source directory: /home/tongz/project/spark/spark/mllib/src/test/scala
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ spark-mllib_2.10 ---
[INFO] Source directory: /home/tongz/project/spark/spark/mllib/src/main/scala added.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-mllib_2.10 ---
[WARNING] Invalid POM for net.sf.opencsv:opencsv:jar:2.3, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] Invalid project model for artifact [opencsv:net.sf.opencsv:2.3]. It will be ignored by the remote resources Mojo.
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-mllib_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 26 resources
[INFO] Copying 3 resources
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ spark-mllib_2.10 ---
[INFO] Using zinc server for incremental compilation
[INFO] compiler plugin: BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[info] Compiling 144 Scala sources and 2 Java sources to /home/tongz/project/spark/spark/mllib/target/scala-2.10/classes...
[error] error while loading , error in opening zip file
[error] object scala.runtime in compiler mirror not found.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .......................... SUCCESS [4.145s]
[INFO] Spark Project Networking .......................... SUCCESS [11.811s]
[INFO] Spark Project Shuffle Streaming Service ........... SUCCESS [6.064s]
[INFO] Spark Project Core ................................ SUCCESS [2:39.458s]
[INFO] Spark Project Bagel ............................... SUCCESS [5.837s]
[INFO] Spark Project GraphX .............................. SUCCESS [17.580s]
[INFO] Spark Project Streaming ........................... SUCCESS [30.898s]
[INFO] Spark Project Catalyst ............................ SUCCESS [34.868s]
[INFO] Spark Project SQL ................................. SUCCESS [41.695s]
[INFO] Spark Project ML Library .......................... FAILURE [0.522s]
[INFO] Spark Project Tools ............................... SKIPPED
[INFO] Spark Project Hive ................................ SKIPPED
[INFO] Spark Project REPL ................................ SKIPPED
[INFO] Spark Project Assembly ............................ SKIPPED
[INFO] Spark Project External Twitter .................... SKIPPED
[INFO] Spark Project External Flume Sink ................. SKIPPED
[INFO] Spark Project External Flume ...................... SKIPPED
[INFO] Spark Project External MQTT ....................... SKIPPED
[INFO] Spark Project External ZeroMQ ..................... SKIPPED
[INFO] Spark Project External Kafka ...................... SKIPPED
[INFO] Spark Project Examples ............................ SKIPPED
[INFO] Spark Project External Kafka Assembly ............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5:13.600s
[INFO] Finished at: Sun May 03 21:23:26 EDT 2015
[INFO] Final Memory: 41M/499M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-mllib_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :spark-mllib_2.10
```
These are the last few lines of the error message; I can provide more if needed.
Thanks in advance!
OK, I waited 12 hours and still got no answer! I dug around a lot, and I think I found the answer myself. Here is the trick:

```
sbt clean clean-files
rm -rf ~/.ivy2 ~/.m2 ~/.sbt
```
These two lines were the problem:

```
[error] error while loading , error in opening zip file
[error] object scala.runtime in compiler mirror not found.
```
From what I understand, some Scala or Maven packages in my local caches had been corrupted earlier, which caused this error, so I had to remove them. It may also be that my sbt state was stale, which is why I cleaned that as well.
PS: if you want to find out which packages are broken, run:

```
find ~/.ivy2 ~/.m2 ~/.sbt -name "*.jar" -exec unzip -qqt {} \;
```
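The one-liner above runs `unzip`'s integrity test on every jar, but it does not clearly flag which archives failed. A small wrapper like the following can print only the corrupt ones; this is a sketch assuming `unzip` is on the PATH, and `scan_jars` is just a name I chose for illustration:

```shell
#!/bin/sh
# scan_jars DIR... : print the path of every .jar under the given
# directories that fails unzip's integrity test (`unzip -qqt`).
scan_jars() {
  for dir in "$@"; do
    [ -d "$dir" ] || continue          # skip caches that don't exist
    find "$dir" -name '*.jar' | while read -r jar; do
      # -qqt tests the archive quietly; a non-zero exit means corruption
      unzip -qqt "$jar" >/dev/null 2>&1 || echo "corrupt: $jar"
    done
  done
}

# Check the same caches the fix above wipes wholesale
scan_jars "$HOME/.ivy2" "$HOME/.m2" "$HOME/.sbt"
```

With this you could delete only the jars it reports instead of removing the entire `~/.ivy2`, `~/.m2`, and `~/.sbt` directories, at the cost of a full re-download either way for the broken artifacts.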