How to set up hadoop environment variables
I am following this page in the official documentation. I also downloaded hadoop 2.2.0 and placed it under $HOME/opt. Now I have this file structure:
$ ls -1 ~/opt/hadoop-2.2.0/
LICENSE.txt
NOTICE.txt
README.txt
bin/
etc/
include/
lib/
libexec/
sbin/
share/
$ ls -1 ~/opt/hadoop-2.2.0/share/hadoop/
common/
hdfs/
httpfs/
mapreduce/
tools/
yarn/
In the page I mentioned above, there is this paragraph:
Assuming you have installed hadoop-common/hadoop-hdfs and exported $HADOOP_COMMON_HOME/$HADOOP_HDFS_HOME, untar hadoop mapreduce tarball and set environment variable $HADOOP_MAPRED_HOME to the untarred directory. Set $HADOOP_YARN_HOME the same as $HADOOP_MAPRED_HOME.
So, my question is, given my file structure, how should I set up the hadoop environment variables ($HADOOP_COMMON_HOME, $HADOOP_HDFS_HOME, $HADOOP_YARN_HOME, etc.)? Thank you very much.
You could set them as follows; note that the last two are not mandatory to get HDFS and YARN working.
HADOOP_COMMON_HOME=$HOME/opt/hadoop-2.2.0/
HADOOP_HDFS_HOME=$HOME/opt/hadoop-2.2.0/share/hadoop/hdfs
HADOOP_YARN_HOME=$HOME/opt/hadoop-2.2.0/share/hadoop/yarn
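Concretely, here is a minimal sketch of what you might add to your ~/.bashrc, assuming the $HOME/opt/hadoop-2.2.0 layout shown in the question. The HADOOP_PREFIX variable and the PATH additions are my own convention for convenience, not something the documentation requires:

```shell
# Base install directory (convenience variable, not required by Hadoop itself)
export HADOOP_PREFIX="$HOME/opt/hadoop-2.2.0"

# The variables from the answer above
export HADOOP_COMMON_HOME="$HADOOP_PREFIX"
export HADOOP_HDFS_HOME="$HADOOP_PREFIX/share/hadoop/hdfs"
export HADOOP_YARN_HOME="$HADOOP_PREFIX/share/hadoop/yarn"

# Optional: put bin/ and sbin/ on PATH so hadoop, hdfs, yarn, and the
# start-*.sh scripts can be run without typing the full path
export PATH="$HADOOP_PREFIX/bin:$HADOOP_PREFIX/sbin:$PATH"
```

After editing the file, run `source ~/.bashrc` (or open a new shell) and verify with `echo $HADOOP_COMMON_HOME`.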