
HDFS INotify and Kerberos authentication in Java client

I'm using this example from the Internet: hdfs-inotify-example. The build completes with no errors, but execution ends with this error:

Exception in thread "main" org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]

The Kerberos system is working and I have a fresh Kerberos ticket that works perfectly, so I'm not sure this is a Kerberos problem. I also set this environment variable:

export HADOOP_CONF_DIR=/etc/hadoop/conf  

which points to the directory containing core-site.xml, where the security settings are, AFAIK, correct:

  <property>
    <name>hadoop.security.authentication</name>
    <value>kerberos</value>
  </property>
  <property>
    <name>hadoop.security.authorization</name>
    <value>true</value>
  </property>
  <property>
    <name>hadoop.rpc.protection</name>
    <value>authentication</value>
  </property>
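One thing worth checking: `HADOOP_CONF_DIR` is honored by the `hadoop` launcher scripts, but a plain `java -jar` run does not read it, so `new Configuration()` falls back to SIMPLE authentication unless core-site.xml is on the classpath. A quick way to see what the client actually picks up is a sketch like this (assumes the Hadoop client jars are available; `AuthCheck` is just an illustrative class name):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class AuthCheck {
    public static void main(String[] args) {
        // core-site.xml is loaded automatically only when it is on the
        // classpath; HADOOP_CONF_DIR alone is not enough for "java -jar".
        Configuration conf = new Configuration();
        System.out.println("hadoop.security.authentication = "
                + conf.get("hadoop.security.authentication", "simple"));

        // UserGroupInformation only honors these settings after this call.
        UserGroupInformation.setConfiguration(conf);
        System.out.println("Security enabled: "
                + UserGroupInformation.isSecurityEnabled());
    }
}
```

If this prints `simple` even though core-site.xml says `kerberos`, the file is simply not being found by the JVM.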

What's going wrong? Any suggestions are appreciated (a lot).

I'm using Hadoop 2.6.0-cdh5.10.1

Finally, I found how to solve the issue:

This is the patch I applied to hdfs-inotify-example:

diff --git a/src/main/java/com/onefoursix/HdfsINotifyExample.java b/src/main/java/com/onefoursix/HdfsINotifyExample.java
index 97ac409..32321b1 100644
--- a/src/main/java/com/onefoursix/HdfsINotifyExample.java
+++ b/src/main/java/com/onefoursix/HdfsINotifyExample.java
@@ -11,6 +11,7 @@ import org.apache.hadoop.hdfs.inotify.Event.CreateEvent;
 import org.apache.hadoop.hdfs.inotify.Event.UnlinkEvent;
 import org.apache.hadoop.hdfs.inotify.EventBatch;
 import org.apache.hadoop.hdfs.inotify.MissingEventsException;
+import org.apache.hadoop.security.UserGroupInformation;

 public class HdfsINotifyExample {

@@ -21,10 +22,20 @@ public class HdfsINotifyExample {
                if (args.length > 1) {
                        lastReadTxid = Long.parseLong(args[1]);
                }
-
+        
+                System.out.println("com.onefoursix.HdfsINotifyExample.main()");
                System.out.println("lastReadTxid = " + lastReadTxid);
-
-               HdfsAdmin admin = new HdfsAdmin(URI.create(args[0]), new Configuration());
+                Configuration config = new Configuration();
+                
+                config.set("hadoop.security.authentication", "kerberos");
+                config.set("hadoop.security.authorization", "true");
+                config.set("dfs.namenode.kerberos.principal", "hdfs/_HOST@AD.XXXXX.COM");
+                config.set("dfs.namenode.kerberos.principal.pattern", "hdfs/*@AD.XXXXX.COM");
+                
+                UserGroupInformation.setConfiguration(config);
+                System.out.println("Security enabled " + UserGroupInformation.isSecurityEnabled());
+                
+               HdfsAdmin admin = new HdfsAdmin(URI.create(args[0]), config);

                DFSInotifyEventInputStream eventStream = admin.getInotifyEventStream(lastReadTxid);
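For completeness, the stream returned by `getInotifyEventStream` can then be consumed roughly like this (a sketch, not part of the original patch; it assumes the Hadoop client libraries and a valid Kerberos ticket):

```java
import java.io.IOException;
import org.apache.hadoop.hdfs.DFSInotifyEventInputStream;
import org.apache.hadoop.hdfs.inotify.Event;
import org.apache.hadoop.hdfs.inotify.EventBatch;
import org.apache.hadoop.hdfs.inotify.MissingEventsException;

public class EventPoller {
    // Blocks forever, printing each event batch as the NameNode reports it.
    static void poll(DFSInotifyEventInputStream eventStream)
            throws IOException, InterruptedException, MissingEventsException {
        while (true) {
            EventBatch batch = eventStream.take(); // waits for the next batch
            for (Event event : batch.getEvents()) {
                System.out.println("txid=" + batch.getTxid()
                        + " type=" + event.getEventType());
            }
        }
    }
}
```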

Authentication worked fine. Finally I got:

Exception in thread "main" org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Access denied for user xxxxxxxx. Superuser privilege is required

which tells me the current user is not allowed to read the NameNode's event stream, but that is another story.

Adding to ozw1z5rd's answer:

Log in as the superuser (hdfs) and execute the program there:

$ sudo -i -u hdfs
$ cp shaded-fat-jar.jar /home/hdfs

and run the program from the jar file copied to hdfs's home directory.
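Alternatively, instead of switching to the hdfs shell account, the client can authenticate from a keytab inside the program. A minimal sketch (the principal and keytab path below are placeholders, not from the original post; only a principal the NameNode treats as a superuser can read the inotify stream):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class KeytabLogin {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Placeholder principal and keytab path: adjust to your realm
        // and to wherever the hdfs service keytab actually lives.
        UserGroupInformation.loginUserFromKeytab(
                "hdfs@AD.XXXXX.COM", "/etc/security/keytabs/hdfs.keytab");
        System.out.println("Logged in as "
                + UserGroupInformation.getLoginUser().getUserName());
    }
}
```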
