
Hadoop Map-Reduce Output File Exception

I am getting this error while running a single-node Hadoop cluster on an Amazon d2.2xlarge instance, and I also cannot view my output. Can anyone provide the proper steps to resolve this issue?

"Caused by: org.apache.hadoop.util.DiskChecker$DiskErrorException: Could not
 find any valid local directory for output/file.out"

These are the steps I executed:

bin/hdfs dfsadmin -safemode leave                            
bin/hadoop fs -mkdir /inputfiles    
bin/hadoop dfsadmin -safemode leave    
bin/hadoop fs -mkdir /output    
bin/hdfs dfsadmin -safemode leave       
bin/hadoop fs -put input1 /inputfiles    
bin/hdfs dfsadmin -safemode leave   
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar  
wordcount /inputfiles /output

You should not create the output directory for a MapReduce job; Hadoop creates it itself and the job fails if the directory already exists.

Remove this command:

bin/hadoop fs -mkdir /output  

And change the last command to:

bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar  
wordcount /inputfiles /output1

Make sure that you have permission to create output1 under /.

If not, I would prefer the directory structure below:

/home/your_user_name/input for the input directory files, and

/home/your_user_name/output for the output directory.
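Putting that advice together, the full corrected sequence might look like the sketch below. This assumes Hadoop 2.7.2 unpacked in the current directory, that `your_user_name` is replaced with your actual user, and that the output directory does not exist yet; the commands are environment-dependent and need a running HDFS.

```shell
# Create only the input directory in HDFS (-p also creates parent directories).
# Do NOT pre-create the output directory: the job creates it and fails if it exists.
bin/hadoop fs -mkdir -p /home/your_user_name/input

# Copy the local file input1 into the HDFS input directory.
bin/hadoop fs -put input1 /home/your_user_name/input

# Run the wordcount example; the output path must not exist yet.
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar \
    wordcount /home/your_user_name/input /home/your_user_name/output

# Inspect the result once the job finishes.
bin/hadoop fs -cat /home/your_user_name/output/part-r-00000
```

If you need to re-run the job, either delete the output directory first with `bin/hadoop fs -rm -r /home/your_user_name/output` or pass a fresh output path.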
