NoClassDefFoundError caused by ClassNotFoundException in Hadoop on Hive Driver Connection line?
Typically I start by Googling for a solution, but this error does not seem to have come up before.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/Shell
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.findHadoopBinary(HiveConf.java:906)
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:237)
at org.apache.hive.jdbc.HiveConnection.isHttpTransportMode(HiveConnection.java:221)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:138)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:123)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at com.merck.ghh.ingestion.HiveTableSetup.tableSetup(HiveTableSetup.java:31)
at com.merck.ghh.ingestion.HiveTableSetup.main(HiveTableSetup.java:546)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.Shell
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 10 more
The following code causes this error, specifically on the Connection declaration line:
try {
    Class.forName(driverName);
    Connection connection = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "hive", "");
    Statement statement = connection.createStatement();
} catch (ClassNotFoundException | SQLException exception) {
    exception.printStackTrace();
}
Other questions seem to point to this being related to not including hadoop-core in the dependencies, but hadoop-core does not seem to exist in Hadoop 2.X. I am specifically working in Hadoop 2.1.0.2.0.5.0-67 with Hive 0.12.0. Before this error I was having issues because my Hive dependencies had gotten destroyed at some point, but I put those back in and this error happened next. I'm wondering if it's not something as simple as forgetting a dependency, but I'm not finding which dependency I might be missing.
Any help with this is greatly appreciated.
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.Shell
As you are using Hadoop 2.X, you need to download hadoop-common-2.1.0-beta.jar or hadoop-common-2.0.0-cdh4.4.0.jar and add it to the classpath to avoid the ClassNotFoundException.
You are right that earlier versions used hadoop-core-0.20.2-737.jar.
For more, visit http://grepcode.com/
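If the project is built with Maven, the missing class can also be pulled in by declaring hadoop-common as a dependency instead of adding the jar to the classpath by hand. A minimal sketch, assuming the 2.1.0-beta version mentioned above (adjust the version to match your distribution, e.g. a vendor build of 2.1.0.2.0.5.0-67):

```xml
<!-- Supplies org.apache.hadoop.util.Shell (the class the stack trace
     reports missing). Version shown is an assumption; match it to
     the Hadoop version your cluster actually runs. -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.1.0-beta</version>
</dependency>
```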
Try copying the JDBC connector jar into the $HIVE_HOME/lib/ folder. I faced a similar issue and copying the jar file worked for me.