
Hadoop authentication with Kerberos error

I am trying to create files in HDFS using:

import java.io.FileOutputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

To do so, I set up the configuration like this:

Configuration configuration = new Configuration();

configuration.set("fs.hdfs.impl",
        org.apache.hadoop.hdfs.DistributedFileSystem.class.getName()
);
configuration.set("fs.file.impl",
        org.apache.hadoop.fs.LocalFileSystem.class.getName()
);

OutputStream fileout1 = new FileOutputStream("CONF_before.XML");
configuration.writeXml(fileout1);

configuration.addResource(new Path("/etc/hive/conf.cloudera.hive/hdfs-site.xml"));
configuration.addResource(new Path("/etc/hive/conf.cloudera.hive/core-site.xml"));
OutputStream fileout = new FileOutputStream("CONF_after.XML");
configuration.writeXml(fileout);
FileSystem hdfs = FileSystem.get(configuration);

Path out_path = new Path(hdfs.getWorkingDirectory() + "/OD.xml");
OutputStream os = hdfs.create(out_path);

When I run this code, I get an error at OutputStream os = hdfs.create(out_path):

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN,KERBEROS]

However, if I add core-site.xml to the project artifact and run it on a server, there are no errors.

The dumped configurations are the same in both cases. The relevant part of core-site.xml is:

<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>false</value>
</property>
<property>
  <name>hadoop.rpc.protection</name>
  <value>authentication</value>
</property>

Any ideas why this happens? Thanks!

Try adding this to hdfs-site.xml:

<property>
  <name>ipc.client.fallback-to-simple-auth-allowed</name>
  <value>true</value>
</property>

I faced this problem too, and it turned out that:

configuration.addResource(new Path("..."))

was not loading the file.

I did not trace the cause, but switching to the overload that accepts an InputStream worked:

configuration.addResource(new FileInputStream(new File("...")))

You wrote that adding the XMLs to your JAR resources solves the problem. That is because, by default, the Configuration class looks for two XML files on your classpath and tries to load them. For quick reference, a fragment of the Configuration class implementation:

addDefaultResource("core-default.xml");
addDefaultResource("core-site.xml");
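You can verify this classpath behavior without Hadoop at all: the plain-JDK check below (a diagnostic sketch; the class name ClasspathCheck is made up, and core-site.xml is simply the resource name that Configuration tries to auto-load) tells you whether the file is actually visible on your classpath.

```java
// Quick diagnostic: is a given resource visible on the classpath?
// Configuration's default-resource loading can only find files that
// this same ClassLoader lookup would find.
public class ClasspathCheck {

    // Returns true if the named resource can be located on the classpath.
    static boolean onClasspath(String name) {
        return ClasspathCheck.class.getClassLoader().getResource(name) != null;
    }

    public static void main(String[] args) {
        // When this prints "false", Configuration silently falls back to
        // its built-in defaults - the situation described above.
        System.out.println("core-site.xml on classpath: "
                + onClasspath("core-site.xml"));
    }
}
```

If this prints false in your local run but true on the server, that alone explains the difference in behavior between the two environments.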

According to the error message (RemoteException ... AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN,KERBEROS]) and the configuration property hadoop.security.authentication = kerberos, you are accessing a Kerberos-secured cluster, but your HDFS client is not picking up this configuration and is attempting simple authentication instead.
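In that situation, the usual client-side approach (a hedged sketch, not code from the question: it assumes the hadoop-client dependency is available, and the principal user@EXAMPLE.COM and keytab path are placeholders you must replace) is to hand the loaded configuration to UserGroupInformation and log in before calling FileSystem.get:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosClient {
    public static void main(String[] args) throws IOException {
        Configuration configuration = new Configuration();
        // Load the cluster configuration explicitly, as in the question.
        configuration.addResource(
                new Path("/etc/hive/conf.cloudera.hive/core-site.xml"));
        configuration.addResource(
                new Path("/etc/hive/conf.cloudera.hive/hdfs-site.xml"));

        // The security layer must see hadoop.security.authentication=kerberos
        // before any login or FileSystem access happens.
        UserGroupInformation.setConfiguration(configuration);

        // Placeholder principal and keytab - replace with real values.
        UserGroupInformation.loginUserFromKeytab(
                "user@EXAMPLE.COM", "/path/to/user.keytab");

        FileSystem hdfs = FileSystem.get(configuration);
        System.out.println("Logged in as: "
                + UserGroupInformation.getCurrentUser());
    }
}
```

This cannot run outside a Kerberos-enabled cluster, but the key ordering it illustrates - addResource, then setConfiguration, then login, then FileSystem.get - is what makes the client negotiate Kerberos instead of SIMPLE.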
