
Spark Java error in standalone application

I was able to create an executable using launch4j, and it works fine on my machine. When I send it to someone to run on their Windows machine, they get the following error:

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: spark/TemplateEngine
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Unknown Source)
        at java.lang.Class.privateGetMethodRecursive(Unknown Source)
        at java.lang.Class.getMethod0(Unknown Source)
        at java.lang.Class.getMethod(Unknown Source)
        at sun.launcher.LauncherHelper.validateMainClass(Unknown Source)
        at sun.launcher.LauncherHelper.checkAndLoadMain(Unknown Source)
Caused by: java.lang.ClassNotFoundException: spark.TemplateEngine
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        ... 7 more

Any thoughts?

I had this problem because my default Java version was 9, and somehow Spark didn't recognize it, so I changed to version 8 and it worked. To change it on Linux:

sudo update-java-alternatives -s java-1.8.0-openjdk-amd64

In your case you may want another version, so choose yours (to list the versions installed on your computer, use the -l option, as shown below).
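A sketch of what that looks like; the entries and priority numbers below are illustrative and depend on what is installed on your machine:

sudo update-java-alternatives -l
# example output (illustrative):
# java-1.8.0-openjdk-amd64  1081  /usr/lib/jvm/java-1.8.0-openjdk-amd64
# java-9-openjdk-amd64      1091  /usr/lib/jvm/java-9-openjdk-amd64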

The relative path must be set in the classpath so that the executable can find the jar files.
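One common way to get that relative classpath into the jar's manifest is Maven's maven-jar-plugin. Below is a minimal sketch, assuming the dependency jars are copied into a lib/ folder next to the executable jar; com.example.Main is a placeholder for your actual main class:

<plugin>
   <groupId>org.apache.maven.plugins</groupId>
   <artifactId>maven-jar-plugin</artifactId>
   <configuration>
      <archive>
         <manifest>
            <!-- writes a relative "Class-Path: lib/..." entry into MANIFEST.MF -->
            <addClasspath>true</addClasspath>
            <classpathPrefix>lib/</classpathPrefix>
            <!-- placeholder; replace with your real main class -->
            <mainClass>com.example.Main</mainClass>
         </manifest>
      </archive>
   </configuration>
</plugin>

With this in place, the lib/ folder must be shipped alongside the launch4j executable, since the manifest path is resolved relative to the jar at run time.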

This exception may also occur if a Maven dependency's <scope> is not compile (the default) and the jar is therefore missing at run time. For instance, if a dependency is marked <scope>provided</scope> in your pom.xml, the build assumes the JRE/environment will supply that jar; the sources compile without error, but when the program runs and the class cannot be found, this exception is thrown.

For example, the declaration below may lead to this exception if spark-mllib_2.11 is not on the classpath at run time, even though it is declared and no error appears while editing:

<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-mllib_2.11</artifactId>
   <version>2.2.0</version>
   <scope>provided</scope>
</dependency>
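If the classes really do need to ship with the application, removing the <scope> line (so the dependency defaults to compile scope) avoids the error. A sketch of the corrected declaration:

<dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-mllib_2.11</artifactId>
   <version>2.2.0</version>
   <!-- no <scope> element: defaults to "compile", so the jar is on the runtime classpath -->
</dependency>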
