
Kerberos Java Credentials Cache

I am running the following code to interact with a Kerberos-enabled Hadoop cluster.

val t1 = new Thread() {
  override def run() {
    println("output of first thread")
    val conf = new Configuration
    conf.set("hadoop.security.authentication", "Kerberos")
    conf.set("fs.defaultFS", "hdfs://192.168.23.206:8020")
    UserGroupInformation.setConfiguration(conf)
    UserGroupInformation.loginUserFromKeytab("dummy@platalyticsrealm", "E:\\dummy.keytab")
    val fs = FileSystem.get(conf)
    val status = fs.listStatus(new Path("/"))

    println(UserGroupInformation.getLoginUser().getShortUserName())
  }
}
val t2 = new Thread() {
  override def run() {
    println("Running Thread 2")
    val conf = new Configuration
    conf.set("hadoop.security.authentication", "Kerberos")
    conf.set("fs.defaultFS", "hdfs://192.168.23.206:8020")
    UserGroupInformation.setConfiguration(conf)
    UserGroupInformation.loginUserFromKeytab("test@platalyticsrealm", "E:\\test.keytab")
    val fs = FileSystem.get(conf)
    val status = fs.listStatus(new Path("/"))

    println(UserGroupInformation.getLoginUser().getShortUserName())
  }
}
t1.start()
Thread.sleep(5000)
t2.start()

This code produces the following output:

test

test

This means the second thread overwrote the credentials obtained by the first thread. I have the following questions:

1. Where are the credentials stored in my Windows environment? I searched under C:\Users\username but did not find them.
2. How can I tackle this problem of the credentials cache being overwritten when multiple users try to access Hadoop at the same time?

Thanks

Your code clearly uses static methods to set the default, implicit, global, JVM-wide UGI. That's what people need 99% of the time.

But if you need to serve multiple sessions for multiple users, in client-server mode, then clearly that cannot work. Please read that tutorial (chosen at random from a Google search), under the section "Multiple UGIs". Then do some research by yourself.

If you want to dig into the dirty implementation details, you might peek into that awe-inspiring grimoire by the guy who actually maintains the Hadoop security code base (also the Spark code base and the ZK code base) and is not too happy about that.
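A minimal sketch of the "Multiple UGIs" approach: instead of `loginUserFromKeytab`, which replaces the JVM-wide login user, each user gets an isolated `UserGroupInformation` via `loginUserFromKeytabAndReturnUGI` and runs HDFS calls inside `ugi.doAs`. The principals, keytab paths, and NameNode address below are copied from the question; this only runs against a reachable, Kerberos-enabled cluster:

```scala
import java.security.PrivilegedExceptionAction

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation

object MultiUgiExample {
  def main(args: Array[String]): Unit = {
    val conf = new Configuration()
    conf.set("hadoop.security.authentication", "Kerberos")
    conf.set("fs.defaultFS", "hdfs://192.168.23.206:8020")
    UserGroupInformation.setConfiguration(conf)

    // Each call returns a separate, isolated UGI; neither touches the
    // global login user, so the two sets of credentials cannot clobber
    // each other.
    val dummyUgi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      "dummy@platalyticsrealm", "E:\\dummy.keytab")
    val testUgi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      "test@platalyticsrealm", "E:\\test.keytab")

    def listRootAs(ugi: UserGroupInformation): Unit =
      ugi.doAs(new PrivilegedExceptionAction[Unit] {
        override def run(): Unit = {
          // FileSystem.get inside doAs resolves against the current UGI
          // (the FileSystem cache is keyed by UGI, among other things).
          val fs = FileSystem.get(conf)
          fs.listStatus(new Path("/")).foreach(s => println(s.getPath))
          println(ugi.getShortUserName)
        }
      })

    listRootAs(dummyUgi)
    listRootAs(testUgi)  // prints "test" without disturbing dummy's UGI
  }
}
```

With this structure the two `listRootAs` calls (or the two threads from the question) each see their own short user name, because `doAs` scopes the credentials to the action instead of relying on JVM-global state.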

