Can't load imported class with .jar
I currently have a Java-based Maven project which, when run, spins up a Jetty server and loads a web app at localhost:4567. I want to deploy this app to a server on a CentOS virtual machine, but I'm not sure where to start. The CentOS box currently has an Apache server running, and Maven, the database, and other dependencies are installed.
I packaged the project as a .jar, but I'm currently unable to run the .jar file. I get the exception "java.lang.NoClassDefFoundError: spark/Route" when I run the jar via the command prompt or an IDE.
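A quick way to see what actually ended up in the jar (a sketch, assuming the default artifact name target/labfinder-1.0-SNAPSHOT.jar produced by the pom below):

# With the plain maven-jar-plugin build, no spark/ entries show up here,
# which matches the NoClassDefFoundError:
jar tf target/labfinder-1.0-SNAPSHOT.jar | grep spark

# The manifest's Class-Path (from addClasspath) only names external jars;
# it does not bundle them into the archive:
unzip -p target/labfinder-1.0-SNAPSHOT.jar META-INF/MANIFEST.MF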
pom.xml looks like this:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.labfinder</groupId>
    <artifactId>labfinder</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>LabFinder</name>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>

    <repositories>
        <repository>
            <id>Spark repository</id>
            <url>http://www.sparkjava.com/nexus/content/repositories/spark/</url>
        </repository>
    </repositories>

    <build>
        <plugins>
            <plugin>
                <!-- Build an executable JAR -->
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>2.4</version>
                <configuration>
                    <archive>
                        <manifest>
                            <addClasspath>true</addClasspath>
                            <mainClass>com.labfinder.Main</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
        </plugins>
    </build>

    <dependencies>
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongodb-driver</artifactId>
            <version>3.1.0</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>2.3</version>
        </dependency>
        <dependency>
            <groupId>org.freemarker</groupId>
            <artifactId>freemarker</artifactId>
            <version>2.3.23</version>
        </dependency>
        <dependency>
            <groupId>commons-collections</groupId>
            <artifactId>commons-collections</artifactId>
            <version>3.2.1</version>
        </dependency>
    </dependencies>
</project>
As far as I know, I shouldn't have to put Spark Java into my class-path. Anyone know what's going on here?
You should use the maven-assembly-plugin to build a single executable jar with the dependencies included, also known as a fat jar. The maven-jar-plugin configuration you have now only writes a Class-Path entry into the manifest; it expects the dependency jars to sit next to your jar at runtime and does not bundle them into the archive, which is why spark/Route can't be found.
Instead of the maven-jar-plugin, try using the following in your pom.xml:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>2.4.1</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
            <configuration>
                <finalName>${project.artifactId}</finalName>
                <attach>false</attach>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
                <archive>
                    <manifest>
                        <mainClass>com.labfinder.Main</mainClass>
                    </manifest>
                </archive>
            </configuration>
        </execution>
    </executions>
</plugin>
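Then rebuild and run the assembled jar. A sketch of the commands (the output name assumes the configuration above: the labfinder finalName plus the jar-with-dependencies suffix the assembly plugin appends by default; adjust if your build produces a different file):

mvn clean package
java -jar target/labfinder-jar-with-dependencies.jar

Because the fat jar bundles spark-core and the other runtime dependencies inside the archive itself, spark/Route resolves at startup, and deploying to the CentOS machine comes down to copying that single file over and running the same java -jar command there.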