
Building Apache Spark SQL Core

I am trying to build Apache Spark SQL Core (1.4.1) and I get the stack trace below. But if I build the whole Spark project, everything goes well and the build finishes successfully. Any ideas?
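For context, a submodule-only build in a Spark 1.4.x checkout is typically invoked along these lines (the exact commands aren't shown in the question, so this is an assumed reproduction of the two scenarios):

# Building only the spark-sql module -- the invocation that fails:
cd sql/core
mvn -DskipTests package

# Building from the top of the source tree -- this succeeds:
cd ../..
mvn -DskipTests package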

The stack trace

[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala:258: value globPathIfNecessary is not a member of org.apache.spark.deploy.SparkHadoopUtil
[error]         SparkHadoopUtil.get.globPathIfNecessary(qualified)
[error]                             ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/DataFrameReader.scala:263: value map is not a member of Array[Nothing]
[error]           globbedPaths.map(_.toString), None, None, extraOptions.toMap)(sqlContext))
[error]                        ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/MonotonicallyIncreasingID.scala:22: object Nondeterministic is not a member of package org.apache.spark.sql.catalyst.expressions
[error] import org.apache.spark.sql.catalyst.expressions.{Nondeterministic, LeafExpression}
[error]        ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/MonotonicallyIncreasingID.scala:36: not found: type Nondeterministic
[error] private[sql] case class MonotonicallyIncreasingID() extends LeafExpression with Nondeterministic {
[error]                                                                                 ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/SparkPartitionID.scala:22: object Nondeterministic is not a member of package org.apache.spark.sql.catalyst.expressions
[error] import org.apache.spark.sql.catalyst.expressions.{Nondeterministic, LeafExpression}
[error]        ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/execution/expressions/SparkPartitionID.scala:30: not found: type Nondeterministic
[error] private[sql] case object SparkPartitionID extends LeafExpression with Nondeterministic {
[error]                                                                       ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/sources/ddl.scala:252: value globPathIfNecessary is not a member of org.apache.spark.deploy.SparkHadoopUtil
[error]             SparkHadoopUtil.get.globPathIfNecessary(qualifiedPattern).map(_.toString).toArray
[error]                                 ^
[error] /home/ubuntu/Dev/spark/sql/core/src/main/scala/org/apache/spark/sql/sources/ddl.scala:279: value globPathIfNecessary is not a member of org.apache.spark.deploy.SparkHadoopUtil
[error]             SparkHadoopUtil.get.globPathIfNecessary(qualifiedPattern).map(_.toString).toArray
[error]                                 ^
[error] 8 errors found
[debug] Compilation failed (CompilerInterface)
[error] Compile failed at Jul 21, 2015 5:57:38 AM [29.605s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 38.435s
[INFO] Finished at: Tue Jul 21 05:57:38 UTC 2015
[INFO] Final Memory: 37M/609M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-sql_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed. CompileFailed -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile (scala-compile-first) on project spark-sql_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:225)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:compile failed.
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:110)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
    ... 19 more
Caused by: Compile failed via zinc server
    at sbt_inc.SbtIncrementalCompiler.zincCompile(SbtIncrementalCompiler.java:136)
    at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:86)
    at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303)
    at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:119)
    at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:99)
    at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
    ... 20 more
[ERROR] 
[ERROR] 

Well, there's no globPathIfNecessary in SparkHadoopUtil, so it must be your own modification. When you run the build from the top level, the Maven reactor has visibility into the whole project and can see your changes. When you run the build from a subproject, Maven resolves everything outside that subproject from your local repository, so it can't see any modifications unless you've installed them. So run your build again from the top level, but do an install instead of a package to get your modifications installed into your local repo. Once you do that, executing a build from sql/core should be able to resolve your changes successfully, as sketched below.
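A minimal sketch of that fix, assuming a stock Maven setup (-DskipTests is optional and only included to speed the build up):

# From the Spark source root: install every module, including the
# modified spark-core, into the local Maven repository (~/.m2).
mvn -DskipTests clean install

# After that, building the submodule alone resolves the freshly
# installed artifacts instead of the unmodified published ones.
cd sql/core
mvn -DskipTests package

Alternatively, Maven's -pl option can build just the spark-sql module from the top level; adding -am (--also-make) pulls any modified upstream modules such as spark-core into the same reactor, so no install step is needed: mvn -pl :spark-sql_2.10 -am -DskipTests package.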
