
Permission denied when trying to run a job as myuser, but error says root doesn't have enough permissions

I'm trying to get Hadoop MapReduce working, but I keep running into access problems. I'm not sure what the problem is, whether it's the configuration or something else.

When I run, for example, this wordcount:

hadoop jar /usr/share/hadoop/hadoop-examples-1.2.1.jar wordcount /user/myuser/input.txt /user/myuser/output.txt

I get the following error:

14/09/10 20:15:51 INFO input.FileInputFormat: Total input paths to process : 1
14/09/10 20:15:51 INFO mapred.JobClient: Running job: job_201409101946_0010
14/09/10 20:15:52 INFO mapred.JobClient:  map 0% reduce 0%
14/09/10 20:15:52 INFO mapred.JobClient: Task Id : attempt_201409101946_0010_m_000002_0,         Status : FAILED
Error initializing attempt_201409101946_0010_m_000002_0:
org.apache.hadoop.security.AccessControlException:         
org.apache.hadoop.security.AccessControlException: Permission denied: user=root, 
access=EXECUTE, inode="job_201409101946_0010":hadoop:supergroup:rwx------
at sun.reflect.GeneratedConstructorAccessor7.newInstance(Unknown Source)
...

Obviously, user=root cannot access a directory owned by the user hadoop. But the problem is that I'm running the job as myuser, and I'm not sure why root is involved here at all. Do you know what could be causing this issue?

First of all, /user/myuser/ should be a path on HDFS, not a local path. It also needs to exist, so if it doesn't, run:

hadoop dfs -mkdir /user/myuser/

and then

hadoop dfs -chown myuser:groupOfMyuser /user/myuser/

where groupOfMyuser is the group to which myuser belongs.

To check whether it exists, run: hadoop dfs -ls /user/
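If the directory exists but its permissions lock everyone else out (compare the rwx------ in the error above), a hedged extra step is to loosen them; the 755 mode here is an assumption, adjust to your own policy:

hadoop dfs -chmod 755 /user/myuser/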

Then, to upload your files to the input dir on HDFS, use the command:

hadoop dfs -copyFromLocal /local/path/input.txt /user/myuser/
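A quick sanity check that the file landed where expected:

hadoop dfs -ls /user/myuser/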

Note that input and output paths should be directories (on HDFS), not files. So, the correct command to run your program would be:

hadoop jar /usr/share/hadoop/hadoop-examples-1.2.1.jar wordcount /user/myuser /user/myuser/output
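Once the job completes, the output can be read back with a glob over the part files (default part-file naming assumed):

hadoop dfs -cat /user/myuser/output/part-*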

Finally, check whether the Hadoop dirs ($HADOOP_HOME) are owned by myuser, hadoop, or some other user, and run the jars as whoever owns them, or change their ownership with chown.
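A minimal sketch of that ownership check and fix on the local filesystem (the user:group pair is a placeholder, substitute your own):

# who owns the Hadoop installation?
ls -ld $HADOOP_HOME

# if needed, hand it over to myuser (run as root)
chown -R myuser:groupOfMyuser $HADOOP_HOME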

(Assuming you use the old API, but you can easily find the equivalent commands in the new API, as well.)
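As for why root shows up in the error at all: a plausible explanation, assuming the default task controller of Hadoop 1.x, is that child tasks run as whatever OS user started the TaskTracker daemon, so a TaskTracker launched as root initializes job directories as root regardless of who submitted the job. A quick check with standard Linux tools:

ps -ef | grep -E 'JobTracker|TaskTracker' | grep -v grep

The first column shows the user each daemon runs as; if it is root, restart the daemons as the hadoop (or myuser) account.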
