Hadoop 2.2.0 jar files missing
I have installed hadoop-2.2.0 and am trying to run the MapReduce example code that ships with it. It fails every time with a ClassNotFoundException, and as far as I can tell the cause is how the class to run is set in the hadoop shell script. Below is the relevant part of that script; none of these class files are bundled with the installation, although I do see them in the source tree.
if [ "$COMMAND" = "fs" ] ; then
  CLASS=org.apache.hadoop.fs.FsShell
elif [ "$COMMAND" = "version" ] ; then
  CLASS=org.apache.hadoop.util.VersionInfo
elif [ "$COMMAND" = "jar" ] ; then
  CLASS=org.apache.hadoop.util.RunJar
elif [ "$COMMAND" = "checknative" ] ; then
  CLASS=org.apache.hadoop.util.NativeLibraryChecker
elif [ "$COMMAND" = "distcp" ] ; then
  CLASS=org.apache.hadoop.tools.DistCp
  CLASSPATH=${CLASSPATH}:${TOOL_PATH}
elif [ "$COMMAND" = "daemonlog" ] ; then
  CLASS=org.apache.hadoop.log.LogLevel
elif [ "$COMMAND" = "archive" ] ; then
  CLASS=org.apache.hadoop.tools.HadoopArchives
  CLASSPATH=${CLASSPATH}:${TOOL_PATH}
elif [[ "$COMMAND" = -* ]] ; then
  # class and package names cannot begin with a -
  echo "Error: No command named \`$COMMAND' was found. Perhaps you meant \`hadoop ${COMMAND#-}'"
  exit 1
else
  CLASS=$COMMAND
fi
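The dispatch logic above is plain string matching: the first argument to the hadoop command selects a fully qualified Java class name, which the script later passes to the java launcher. A minimal, self-contained sketch of the same pattern (the class names are the real ones from the excerpt; the function name resolve_class is my own, for illustration):

```shell
#!/bin/sh
# Sketch of how the hadoop script maps a subcommand to a Java class.
resolve_class() {
    COMMAND="$1"
    if [ "$COMMAND" = "fs" ] ; then
        CLASS=org.apache.hadoop.fs.FsShell
    elif [ "$COMMAND" = "jar" ] ; then
        CLASS=org.apache.hadoop.util.RunJar
    elif [ "$COMMAND" = "version" ] ; then
        CLASS=org.apache.hadoop.util.VersionInfo
    else
        # anything else is treated as a class name itself
        CLASS=$COMMAND
    fi
    echo "$CLASS"
}

# e.g. `hadoop jar myjob.jar` resolves to org.apache.hadoop.util.RunJar,
# and the real script then does roughly: exec java -cp "$CLASSPATH" $CLASS "$@"
resolve_class jar
```

This is why `hadoop jar …` fails with ClassNotFoundException when org.apache.hadoop.util.RunJar is not on the classpath: the script resolves the name correctly, but the JVM cannot load the class.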
Here is the error:
Exception in thread "main" java.lang.NoClassDefFoundError: org.apache.hadoop.util.RunJar
at gnu.java.lang.MainThread.run(libgcj.so.13)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.RunJar not found in gnu.gcj.runtime.SystemClassLoader{urls=[file:/usr/local/hadoop-2.2.0/etc/hadoop/,file:/usr/local/hadoop-2.2.0/share/hadoop/hdfs/], parent=gnu.gcj.runtime.ExtensionClassLoader{urls=[], parent=null}}
at java.net.URLClassLoader.findClass(libgcj.so.13)
at gnu.gcj.runtime.SystemClassLoader.findClass(libgcj.so.13)
at java.lang.ClassLoader.loadClass(libgcj.so.13)
at java.lang.ClassLoader.loadClass(libgcj.so.13)
at gnu.java.lang.MainThread.run(libgcj.so.13)
I finally figured out the problem. The YARN (for MapReduce) and DFS processes need to be running in the background before any Hadoop job will run. Being a Hadoop n00b, I had missed that. To start the two processes, type start-dfs.sh and start-yarn.sh in a command window. Each of them launches 2 console windows and spits out a lot of diagnostic information.
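After starting the daemons it is worth confirming that they actually came up before submitting a job. A minimal sketch, assuming a standard Hadoop 2.2.0 layout with $HADOOP_HOME set; the daemons_running helper and the expected daemon names reflect a typical pseudo-distributed setup:

```shell
#!/bin/sh
# Start the HDFS and YARN daemons (commented out: requires a Hadoop install).
# "$HADOOP_HOME/sbin/start-dfs.sh"
# "$HADOOP_HOME/sbin/start-yarn.sh"

# A healthy pseudo-distributed setup shows these JVMs in `jps` output.
# daemons_running checks a jps listing for the expected daemon names.
daemons_running() {
    listing="$1"
    for daemon in NameNode DataNode ResourceManager NodeManager; do
        case "$listing" in
            *"$daemon"*) ;;                      # found, keep checking
            *) echo "missing: $daemon"; return 1 ;;
        esac
    done
    echo "all daemons up"
}
```

Typical usage is `daemons_running "$(jps)"`; if anything is reported missing, the corresponding start script has not been run (or has failed, in which case the logs under $HADOOP_HOME/logs are the place to look).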