Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
I'm pretty new to Cloudera Quickstart, so I'm sorry if my explanation isn't very clear. Anyway, I'm writing Java code that reads a file from HDFS. I built a Maven project and set up all the dependencies in the pom.xml, but when I try to launch the jar from the shell (java -jar nameofthefile.jar) I get this error:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
This is my Java code:
package com.hdfs_java_api;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

import java.io.InputStream;
import java.io.IOException;
import java.net.URI;

public class HadoopFileSystemCat {
    public static void main(String[] args) throws IOException {
        String uri = "hdfs://quickstart.cloudera:8020/user/hive/warehouse/Orders.csv";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        InputStream in = null;
        try {
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(in);
        }
    }
}
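As a side note, since `FSDataInputStream` implements `Closeable`, the explicit `finally { IOUtils.closeStream(in); }` can be replaced with try-with-resources (Java 7+). Here is a minimal pure-JDK sketch of that pattern, using a local temp file to stand in for the HDFS stream (the class and file names are hypothetical, for illustration only):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class TryWithResourcesDemo {

    // Copy an entire stream into a String; try-with-resources closes the
    // stream automatically, replacing finally { IOUtils.closeStream(in); }.
    static String readAll(Path file) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        return out.toString("UTF-8");
    }

    public static void main(String[] args) throws IOException {
        // Small local file standing in for the HDFS Orders.csv.
        Path tmp = Files.createTempFile("orders", ".csv");
        Files.write(tmp, "id,amount\n1,9.99\n".getBytes("UTF-8"));
        System.out.print(readAll(tmp));
        Files.delete(tmp);
    }
}
```

The same shape would apply to the Hadoop code: `try (InputStream in = fs.open(new Path(uri))) { ... }`.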
And this is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com</groupId>
    <artifactId>cards</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>
    <name>hdfs_java_api</name>
    <url>http://maven.apache.org</url>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>2.6.0-cdh5.13.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-yarn-common</artifactId>
            <version>2.6.0-cdh5.13.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-common</artifactId>
            <version>2.6.0-cdh5.13.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>2.6.0-cdh5.13.0</version>
        </dependency>
    </dependencies>
    <repositories>
        <repository>
            <id>cloudera</id>
            <name>cloudera</name>
            <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
        </repository>
    </repositories>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <configuration>
                    <archive>
                        <manifest>
                            <mainClass>com.hdfs_java_api.HadoopFileSystemCat</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
As I said, I'm a noob, so please be patient and try to be as clear as possible. Thank you in advance!
I think you are missing the core library:
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>0.20.2</version>
</dependency>
After that, make sure you have included "Maven Dependencies" in the build path.
And also in Deployment Assembly.
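Separately from the dependency list: a `NoClassDefFoundError` at launch with `java -jar` usually means the Hadoop jars are simply not on the runtime classpath, because `maven-jar-plugin` does not bundle dependencies into the jar. One common approach (a sketch, not necessarily what the answerer used) is to add the `maven-shade-plugin` to the pom's `<build><plugins>` section so Maven produces a self-contained "fat" jar; the version number below is an assumption:

```xml
<!-- Bundles all dependencies into the jar, so "java -jar" can
     resolve org.apache.hadoop.fs.FSDataInputStream at runtime. -->
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- Writes Main-Class into the shaded jar's manifest -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.hdfs_java_api.HadoopFileSystemCat</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
```

After `mvn package`, the shaded jar in `target/` should run with `java -jar` without the classpath error.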