How to add offline Apache Spark dependencies to a Java application?
I want to get started with Apache Spark in Java, so I started following this tutorial. In the tutorial, they show a simple desktop app written in the Eclipse IDE, compiled and run using the Gradle plugin for Eclipse.

I followed the tutorial and everything worked perfectly. Here is their build.gradle file:
apply plugin: 'java-library'

repositories {
    jcenter()
}

dependencies {
    compileOnly 'org.apache.spark:spark-core_2.11:2.1.0'
    testImplementation 'org.apache.spark:spark-core_2.11:2.1.0', 'junit:junit:4.12'
}
So far so good! But my real need is to be able to compile and run Spark applications on an offline system. So what I did was search for org.apache.spark:spark-core_2.11:2.1.0 and junit:junit:4.12 on jcenter, download all the .jar files, create a folder named libs in my project root, and then add all those .jar files to my project's Build Path by following this method.
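As an aside (this is a sketch, not part of the original post, and the task name copyDeps is made up): downloading only the artifact jar by hand misses its transitive dependencies. On a machine with network access, Gradle itself can collect the whole dependency tree into libs/, which can then be copied to the offline box:

```groovy
// Hypothetical helper task for the ONLINE machine: resolves the
// declared dependencies, including transitive ones, and copies
// every jar into libs/ for later offline use.
task copyDeps(type: Copy) {
    from configurations.compile   // Gradle 4.x; use runtimeClasspath on newer Gradle
    into 'libs'
}
```

Note that dependencies declared as compileOnly are not part of configurations.compile, so for this to gather the Spark jars they would need to be declared in a resolvable configuration first.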
And I modified the build.gradle file like this:
apply plugin: 'java'
apply plugin: 'application'
//apply plugin: 'java-library'

repositories {
    jcenter()
}

dependencies {
    compile fileTree(dir: 'libs', include: '*.jar')
}
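For completeness (an alternative sketch, not taken from the original post): instead of a fileTree dependency, Gradle can also treat a local folder of jars as a flatDir repository, which lets you keep coordinate-like dependency declarations:

```groovy
// Equivalent offline setup using a flat directory repository.
// Assumes the jars sit in libs/ and are named artifact-version.jar.
repositories {
    flatDir { dirs 'libs' }
}

dependencies {
    compile name: 'spark-core_2.11-2.1.0'
}
```

Either approach resolves only the jars that are physically present, so every transitive dependency still has to be placed in libs/ by hand.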
When I compile this modified application, I get the following errors in the console. What am I doing wrong, and how do I get rid of the problem?
Working Directory: /home/jason/WorkspaceOfGetStartedWithSparkJava/FirstApacheSparkProjectWithOfflineDependencies
Gradle User Home: /usr/share/gradle/bin
Gradle Distribution: Gradle wrapper from target build
Gradle Version: 4.3
Java Home: /usr/lib/jvm/java-8-oracle
JVM Arguments: None
Program Arguments: None
Build Scans Enabled: false
Offline Mode Enabled: false
Gradle Tasks: build
:compileJava
/home/jason/WorkspaceOfGetStartedWithSparkJava/FirstApacheSparkProjectWithOfflineDependencies/src/main/java/main_package/SparkDriverProgram.java:21: error: cannot access Cloneable
        conf.setAppName("Schneider");
                        ^
  class file for scala.Cloneable not found
/home/jason/WorkspaceOfGetStartedWithSparkJava/FirstApacheSparkProjectWithOfflineDependencies/src/main/java/main_package/SparkDriverProgram.java:22: error: cannot access Serializable
conf.setMaster("local");
^
class file for scala.Serializable not found
2 errors
FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':compileJava'.
> Compilation failed; see the compiler error output for details.
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1s
1 actionable task: 1 executed
SparkConf implements the interfaces scala.Cloneable and scala.Serializable, both of which are not part of org.apache.spark:spark-core but of the Scala standard library, org.scala-lang:scala-library. If you add that jar to your project the same way you did with the core jar, those two errors should disappear.
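Concretely, since the question's build.gradle already pulls in every jar under libs/, the fix is just to put the matching scala-library jar there; no build-script change is required. A minimal sketch (the exact Scala version is an assumption; it should match the one your Spark build was compiled against, a 2.11.x release for spark-core_2.11):

```groovy
dependencies {
    // Picks up scala-library-2.11.x.jar automatically once it is in libs/.
    compile fileTree(dir: 'libs', include: '*.jar')

    // On an online machine the equivalent coordinate would be something like:
    // compile 'org.scala-lang:scala-library:2.11.8'
}
```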