Hadoop MAC OS installation woes

So I'm trying to install hadoop on MAC OS X Leopard following the steps in this note: Running Hadoop on a OS X Single Node Cluster.

I reached Step 4: Formatting and running Hadoop, where I entered the following:

hadoop-*/bin/hadoop namenode -format

This produced the following unpleasant output:

Macbook009:~ Hadoop$ hadoop-*/bin/hadoop namenode -format
    Exception in thread "main" java.lang.UnsupportedClassVersionError: Bad version number in .class file
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:676)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:317)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:280)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:375)
Exception in thread "main" java.lang.UnsupportedClassVersionError: Bad version number in .class file
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:676)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:317)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:280)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
    at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:375)

I did some googling, and learned that JAVA_HOME may be set incorrectly. I created a .bash_profile file like this:

export JAVA_HOME=/system/Library/Frameworks/JavaVM.framework/Versions/1.6/Home
export HADOOP_HOME=~/Users/Hadoop/hadoop-0.20.203.0

export PATH=$HADOOP_HOME/bin:$PATH

No go. Same freaking error. What am I doing wrong?

I suspect that the JVM that's actually running Hadoop is not the expected one, but an older one (Java 5). Verify this by running ps (or any Mac equivalent) and examining the command line.
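
For instance, a check along these lines shows the full command line of every running Java process, including the path of the JVM binary that was launched (a sketch; the '[j]ava' pattern just keeps grep from matching its own process):

    # Show running Java processes with their full command lines
    ps aux | grep '[j]ava'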

Try to set JAVA_HOME in $HADOOP_HOME/conf/hadoop-env.sh to the same path as you did in your .bash_profile.
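
In hadoop-env.sh that would look something like the line below (a sketch; this is the framework path from the question, adjusted with a capital S in /System, so change it to match your actual install):

    # In $HADOOP_HOME/conf/hadoop-env.sh, uncomment and set:
    export JAVA_HOME=/System/Library/Frameworks/JavaVM.framework/Versions/1.6/Home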

It may still be a problem with how your JAVA_HOME is set, as it could differ from the examples on the web. Use this command in your terminal to find your JAVA_HOME directory: /usr/libexec/java_home
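
You can also wire its output straight into your .bash_profile so the path is resolved by the system rather than hard-coded (a sketch; the -v flag pins a major version):

    # Resolve JAVA_HOME dynamically instead of hard-coding the path
    export JAVA_HOME=$(/usr/libexec/java_home)
    # Or pin a specific version, e.g. Java 1.6:
    export JAVA_HOME=$(/usr/libexec/java_home -v 1.6)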

This is caused when the JRE you are running is older than the JDK that compiled the class files, for example running classes compiled for 1.6 on the 1.5 JRE. If a class was compiled only for 1.6, it will not work with 1.5.
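
To confirm the mismatch you can inspect the class file format version with javap (a sketch; major version 49 corresponds to Java 5 and 50 to Java 6, and the jar and class names here assume the Hadoop 0.20.203.0 layout from the question):

    # Print the class file version of Hadoop's NameNode class
    javap -verbose -classpath hadoop-core-0.20.203.0.jar org.apache.hadoop.hdfs.server.namenode.NameNode | grep major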

Do a

  java -version

and see which JRE you have. Most likely you have an old one and need to upgrade it.
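
It is also worth comparing the default java on your PATH with the one JAVA_HOME points at, since the two can differ (a sketch):

    java -version                  # the JRE picked up from PATH
    $JAVA_HOME/bin/java -version   # the JRE that JAVA_HOME points at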

Try typing jps and see how many nodes are actually running. There should be 6 of them. Hopefully you shouldn't have a problem.
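
For reference, on a healthy single-node cluster of that era, jps typically reports the five Hadoop daemons plus jps itself (a sketch; PIDs and ordering will vary):

    # Expected processes: NameNode, DataNode, SecondaryNameNode,
    # JobTracker, TaskTracker, and Jps itself
    jps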

The following steps worked seamlessly for me:

http://ragrawal.wordpress.com/2012/04/28/installing-hadoop-on-mac-osx-lion

For the wordcount example, you need to copy your file to HDFS; you can find the command here (this is the only step where I struggled after following the above page):

http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/#running-a-mapreduce-job
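
The copy step boils down to something like the following (a sketch; the local and HDFS paths are placeholders, not paths from that tutorial):

    # Copy a local input file into HDFS so the wordcount job can read it
    hadoop dfs -copyFromLocal /tmp/input.txt /user/hadoop/input/input.txt
    # Verify that the file arrived
    hadoop dfs -ls /user/hadoop/input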

However, for newer versions of OS X like 10.9, setting PATH as described above does not work.

I found a way to add a default PATH environment for Mac in the file /etc/paths.

Open this file in Terminal with sudo.

$ sudo nano /etc/paths (enter the password when prompted).

Append the paths in the format below.

/users/hadoop/hadoop-1.2.1/bin

/users/hadoop/hadoop-1.2.1/sbin

Save the file and restart the machine. After that there is no need to type the whole path to run a Hadoop script command.
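
After the restart you can confirm that the new entries took effect (a sketch):

    # Entries in /etc/paths are assembled into PATH at login
    echo $PATH
    which hadoop    # should resolve to /users/hadoop/hadoop-1.2.1/bin/hadoop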
