
Missing Hive Execution Jar: /usr/local/hadoop/hive/lib/hive-exec-*.jar

I have Hadoop 1.0.4 running on a single-node cluster set up on my Ubuntu machine.

I did the following steps to download and install the Hive release:

> svn co http://svn.apache.org/repos/asf/hive/trunk hive    
> cd hive

My Hive install folder is $HADOOP_HOME/hive, with $HADOOP_HOME=/usr/local/hadoop. I set both environment variables, as well as $JAVA_HOME, in .bashrc under /home/hduser, which is my Hadoop user's home:

export HADOOP_HOME=/usr/local/hadoop

export HIVE_HOME=$HADOOP_HOME/hive

export JAVA_HOME=/usr/lib/jvm/java-6-openjdk

I have also added the bin folders of both Hadoop and Hive to my $PATH variable as follows:

export PATH=$PATH:$HADOOP_HOME/bin    
export PATH=$HIVE_HOME/bin:$PATH

But while running Hive from the CLI, I am getting the error below.

hduser@somnath-laptop:/usr/local/hadoop/hive$ bin/hive    
Missing Hive Execution Jar: /usr/local/hadoop/hive/lib/hive-exec-*.jar

Should I download this jar and add it to /lib/, or is there some Hive-specific environment variable that I need to configure? Any suggestion would be very helpful.

I resolved the problem myself, but I'm not sure what exactly happened.

By following the process I mentioned in my original question, I created $HADOOP_HOME/hive, but it was giving me a missing-jar error.

So, what I did was: I downloaded hive-0.10.0.tar.gz and extracted it under $HADOOP_HOME, so the newly created folder was $HADOOP_HOME/hive-0.10.0.

I copied the entire set of jars under $HADOOP_HOME/hive-0.10.0/lib to $HADOOP_HOME/hive/lib, and when I next executed

$HADOOP_HOME/hive> bin/hive

it worked! Please note that my $HIVE_HOME=$HADOOP_HOME/hive and $HIVE_HOME/bin is added to the PATH. Hope this helps somebody facing a similar problem.
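The fix above can be double-checked with a small helper that mirrors what the bin/hive launcher is complaining about: it simply tests whether the glob $HIVE_HOME/lib/hive-exec-*.jar matches a real file. This is a sketch assuming a POSIX shell; the function name is my own, not part of Hive:

```shell
#!/bin/sh
# Mirror the launcher's check: does <hive_home>/lib contain a
# hive-exec-*.jar?  Prints a diagnosis; returns 0 only when found.
check_hive_exec_jar() {
  hive_home="$1"
  # If the glob matches nothing, the literal pattern is left behind
  # and the -f test fails, so we fall through to "missing".
  for jar in "$hive_home"/lib/hive-exec-*.jar; do
    if [ -f "$jar" ]; then
      echo "found: $jar"
      return 0
    fi
  done
  echo "missing: no hive-exec-*.jar under $hive_home/lib"
  return 1
}
```

For example, `check_hive_exec_jar /usr/local/hadoop/hive` should print the jar's path once the copy step above has been done.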

In my case, a simple reboot after setting the PATH variable helped.

I had the same issue; I used the command `source ~/.bashrc` and the problem was resolved!

Try this:

export HIVE_HOME=$HADOOP_HOME/hive/build/dist
export PATH=$HIVE_HOME/bin:$PATH
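After exporting, one quick sanity check is to confirm that the bin directory really did land as a component of PATH. A tiny helper for that (my own sketch, not part of Hive):

```shell
#!/bin/sh
# Check whether a directory appears as a component of a PATH-style
# string.  Usage: path_contains "<path-string>" "<dir>"
path_contains() {
  case ":$1:" in
    *":$2:"*) return 0 ;;   # exact component match
    *)        return 1 ;;
  esac
}
```

For example: `path_contains "$PATH" "$HIVE_HOME/bin" && echo "hive bin is on PATH"`.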

The tar file apache-hive-0.13.1-src.tar.gz is missing its lib folder. You can download hive-0.12.0.tar.gz and move its lib folder into the apache-hive-0.13.1-src folder. Hive should then work.

Just find the folder containing your hive-exec-*.jar and create a symbolic link to it.

In my case, I first went to the Hive folder using `cd /usr/local/Cellar/hive/1.2.1`

and then ran the command `ln -s libexec/lib/ lib`
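The same symlink trick can be wrapped in a small function for other layouts (a sketch; `link_hive_lib` is my own name, and `-n` is the common GNU/BSD no-dereference flag, so re-running it replaces a stale link instead of nesting one):

```shell
#!/bin/sh
# Point <hive_home>/lib at the directory that really holds the jars
# (libexec/lib in the Homebrew layout mentioned above).
link_hive_lib() {
  hive_home="$1"
  actual_lib="$2"
  ln -sfn "$actual_lib" "$hive_home/lib"
}

# e.g.: link_hive_lib /usr/local/Cellar/hive/1.2.1 libexec/lib
```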

Just want to post what worked for me (in 2017).

Using Spark 2.0.2, I had to change my $HIVE_HOME variable (which, in my case, meant just removing the variable from my .bash_profile).

Hope this helps someone else.

Here is another post about what worked for me in 2017.

This issue happened to me because of the way I (a beginner) extracted the Hive tar file. I downloaded "hive-2.3.0" from us.apache.org and extracted the file to /usr/local/hive. The jar was expected to be in /usr/local/hive/lib, but for some reason it was in /usr/local/hive/**bin/**lib. In other words, there was an extra "/bin" directory under /hive which contained all of the files that should have been directly under /hive. I fixed this problem by renaming the extra /bin directory to "/bin2", moving all the files from /bin2 into the main /hive directory, and removing the unnecessary, now-empty /bin2 directory. Once the .jar file was in the correct directory, there were no problems running Hive! Here are the commands I used:

    cd /usr/local
    mv hive/bin hive/bin2
    mv hive/bin2/* hive
    rm -r hive/bin2

I did the steps below in December 2017 and it worked.

  1. Copied Hive into the hadoop_home directory
  2. Ran the following in Cygwin:

     export HIVE_HOME=$HADOOP_HOME/hive
     export PATH=$HIVE_HOME/bin:$PATH

The question is about the Hive path, so you can check all the configuration files involving the Hive path. Remember that you must first confirm that Hadoop has been installed.

1. The environment parameters (/etc/profile or ~/.profile):

export HIVE_HOME=/usr/app/apache-hive-2.3.0-bin
export PATH=$HIVE_HOME/bin:$PATH

2. $HIVE_HOME/conf/hive-env.sh:

export JAVA_HOME=${Your_JAVA_HOME_directory}
export HADOOP_HOME=${Your_HADOOP_HOME_directory}
export HIVE_HOME=${Your_HIVE_HOME_directory}
export HIVE_CONF_DIR=${Your_HIVE_HOME_directory}/conf

Hive is based on Hadoop, so you must configure Hadoop's path in hive-env.sh.
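For example, with the paths used earlier in this thread, a filled-in hive-env.sh might look like the following (the values are illustrative; adjust them to your install):

```shell
# $HIVE_HOME/conf/hive-env.sh -- example values, adjust to your layout
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk
export HADOOP_HOME=/usr/local/hadoop
export HIVE_HOME=/usr/local/hadoop/hive
export HIVE_CONF_DIR=$HIVE_HOME/conf
```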

  1. Try cross-checking your environment variable path, in case you typed it wrong.

  2. Try reloading .bashrc by typing the following command:

    source ~/.bashrc

  3. Try rebooting your machine.

2021 - Had a similar issue because of a misconfigured $HIVE_HOME environment variable.

Specifically, I had:

$HIVE_HOME="/usr/local/hadoop/hive/bin"

So Hive was looking for lib/hive-exec-*.jar at $HIVE_HOME/lib/hive-exec-*.jar, which is wrong.


The working configuration is therefore:

export HIVE_HOME="/usr/local/hadoop/hive"  # or wherever your Hive is installed
export PATH=${HIVE_HOME}/bin:${PATH}


 