
Connect to a secured HBase using a Java client

I am trying to connect to an HBase instance secured with Kerberos. It is deployed in an HDP3 cluster. Specifically, I am trying to access it with a Java client from a host outside the cluster.

This is my code:

    public static void main(String[] args) throws IOException {
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        System.setProperty("sun.security.krb5.debug", "true");
        System.setProperty("java.security.debug", "gssloginconfig,configfile,configparser,logincontext");
        System.setProperty("java.security.auth.login.config", "hbase.conf");

        Configuration conf = HBaseConfiguration.create();

        String principal = "user@REALM";
        File keytab = new File("/home/user/user.keytab");

        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(principal, keytab.getAbsolutePath());

        ugi.doAs(new PrivilegedAction<Void>() {
            @Override
            public Void run() {
                try {
                    TableName tableName = TableName.valueOf("some_table");
                    final Connection conn = ConnectionFactory.createConnection(conf);
                    System.out.println(" go ");
                    Table table = conn.getTable(tableName);
                    Result r = table.get(new Get(Bytes.toBytes("some_key")));
                    System.out.println(r);
                } catch (IOException e) {
                    e.printStackTrace();
                }
                return null;
            }
        });
    }

This is my JAAS config file:

Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  useTicketCache=false
  keyTab="/home/user/user.keytab"
  principal="user@REALM";
};

All the ZooKeeper and other configuration is taken from the hbase-site.xml file provided by Ambari.
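For reference, the security-related client properties that a secure-HBase client usually picks up from that hbase-site.xml look roughly like the fragment below. This is a hedged sketch: `REALM` is a placeholder, and the `_HOST` principals and `/hbase-secure` znode parent are the typical HDP defaults (the znode parent matches the `/hbase-secure` paths in the ZooKeeper trace further down), not values confirmed for this cluster:

```xml
<!-- hbase-site.xml fragment (illustrative; values must match the cluster) -->
<property>
  <name>hbase.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hbase.master.kerberos.principal</name>
  <value>hbase/_HOST@REALM</value>
</property>
<property>
  <name>hbase.regionserver.kerberos.principal</name>
  <value>hbase/_HOST@REALM</value>
</property>
<property>
  <name>zookeeper.znode.parent</name>
  <value>/hbase-secure</value>
</property>
```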

I get no error; the client just ends up in an infinite loop with a trace like:

[ReadOnlyZKClient-node2:2181,node3:2181,node4:2181@0x50ad3bc1-SendThread(node4:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x371f62d9b230031, packet:: clientPath:/hbase-secure/meta-region-server serverPath:/hbase-secure/meta-region-server finished:false header:: 141,4 replyHeader:: 141,365072222881,0 request:: '/hbase-secure/meta-region-server,F response:: #ffffffff000146d61737465723a313630303019fffffff6ffffff864dffffff99ffffff85151c50425546a11a56e6f64653410ffffff947d18ffffffb0ffffffa6ffffff81ffffffc5ffffff9f2e100183,s{365072220963,365072222074,1588973398227,1589014218472,5,0,0,0,52,0,365072220963}
[ReadOnlyZKClient-node2:2181,node3:2181,node4:2181@0x50ad3bc1-SendThread(node4:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x371f62d9b230031, packet:: clientPath:/hbase-secure/meta-region-server serverPath:/hbase-secure/meta-region-server finished:false header:: 142,4 replyHeader:: 142,365072222881,0 request:: '/hbase-secure/meta-region-server,F response:: #ffffffff000146d61737465723a313630303019fffffff6ffffff864dffffff99ffffff85151c50425546a11a56e6f64653410ffffff947d18ffffffb0ffffffa6ffffff81ffffffc5ffffff9f2e100183,s{365072220963,365072222074,1588973398227,1589014218472,5,0,0,0,52,0,365072220963}
[ReadOnlyZKClient-node2:2181,node3:2181,node4:2181@0x50ad3bc1-SendThread(node4:2181)] DEBUG org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x371f62d9b230031, packet:: clientPath:/hbase-secure/meta-region-server serverPath:/hbase-secure/meta-region-server finished:false header:: 143,4 replyHeader:: 143,365072222881,0 request:: '/hbase-secure/meta-region-server,F response:: #ffffffff000146d61737465723a313630303019fffffff6ffffff864dffffff99ffffff85151c50425546a11a56e6f64653410ffffff947d18ffffffb0ffffffa6ffffff81ffffffc5ffffff9f2e100183,s{365072220963,365072222074,1588973398227,1589014218472,5,0,0,0,52,0,365072220963}

EDIT

OK, it turns out I do get an error; I just had not waited long enough:

Exception in thread "main" java.net.SocketTimeoutException: callTimeout=1200000, callDuration=2350283: Failed after attempts=36, exceptions:
Mon May 11 13:53:42 CEST 2020, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=70631: Call to slave-5.cluster/172.10.96.43:16020 failed on local exception: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] row 'some_table,some_key,99999999999999' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=slave-5.cluster/172.10.96.43:16020,16020,1588595144765, seqNum=-1
 row 'row_key' on table 'some_table' at null
    at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:159)
    at org.apache.hadoop.hbase.client.HTable.get(HTable.java:386)
    at org.apache.hadoop.hbase.client.HTable.get(HTable.java:360)
    at internal.holly.devoptools.hbase.HBaseCli.main(HBaseCli.java:77)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=36, exceptions:
Mon May 11 13:53:42 CEST 2020, null, java.net.SocketTimeoutException: callTimeout=60000, callDuration=70631: Call to slave-5.cluster/172.10.96.43:16020 failed on local exception: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] row 'some_table,some_key,99999999999999' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=slave-5.cluster,16020,1588595144765, seqNum=-1

    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.throwEnrichedException(RpcRetryingCallerWithReadReplicas.java:298)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:242)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:58)
    at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:192)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:269)
    at org.apache.hadoop.hbase.client.ClientScanner.loadCache(ClientScanner.java:437)
    at org.apache.hadoop.hbase.client.ClientScanner.nextWithSyncCache(ClientScanner.java:312)
    at org.apache.hadoop.hbase.client.ClientScanner.next(ClientScanner.java:597)
    at org.apache.hadoop.hbase.client.ConnectionImplementation.locateRegionInMeta(ConnectionImplementation.java:856)
    at org.apache.hadoop.hbase.client.ConnectionImplementation.locateRegion(ConnectionImplementation.java:759)
    at org.apache.hadoop.hbase.client.ConnectionImplementation.locateRegion(ConnectionImplementation.java:745)
    at org.apache.hadoop.hbase.client.ConnectionImplementation.locateRegion(ConnectionImplementation.java:716)
    at org.apache.hadoop.hbase.client.ConnectionImplementation.getRegionLocation(ConnectionImplementation.java:594)
    at org.apache.hadoop.hbase.client.HRegionLocator.getRegionLocation(HRegionLocator.java:72)
    at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:223)
    at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:105)
    ... 3 more
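When debugging a "Failed to find any Kerberos tgt" error like the one above, it can help to first rule out the keytab itself by testing it from the same host with the standard Kerberos tools. A hedged sketch using the paths and principal from the question (these commands need a reachable KDC, so output will vary per environment):

```shell
# List the principals stored in the keytab (does not contact the KDC)
klist -kt /home/user/user.keytab

# Obtain a TGT from the keytab, then show the resulting ticket cache
kinit -kt /home/user/user.keytab user@REALM
klist
```

If `kinit` succeeds here but the Java client still fails, the problem is in how the client picks up the login rather than in the credentials themselves.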

Thanks.

Finally, adding this property made it work:

        conf.set("hadoop.security.authentication", "kerberos");

And this is my final code:

    import java.io.IOException;
    import java.security.PrivilegedExceptionAction;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.security.UserGroupInformation;

    public static void main(String[] args) throws IOException, InterruptedException {
        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");

        Configuration conf = HBaseConfiguration.create();
        conf.set("hadoop.security.authentication", "kerberos");

        String principal = "user@REALM";

        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(principal, "/home/user/principal.keytab");

        Connection conn = ugi.doAs(new PrivilegedExceptionAction<Connection>() {
            @Override
            public Connection run() throws Exception {
                return ConnectionFactory.createConnection(conf);
            }
        });

        TableName tableName = TableName.valueOf("some_table");
        Table table = conn.getTable(tableName);
        Result r = table.get(new Get(Bytes.toBytes("some_key")));

        System.out.println("result: " + r);

    }

I have had the same issue. In my case, while submitting a Spark job, I had included hadoop* and hbase* jars in my spark-submit command. After some inspection, I noticed that those jars did not match the HBase/Hadoop versions running in my YARN cluster. The version difference was minor, but it broke the Kerberos authentication.
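One way to avoid that kind of mismatch is to pin the client artifacts in the build to the cluster's versions instead of bundling jars by hand. A hedged pom.xml sketch; the version properties here are placeholders you would set to whatever the cluster actually runs:

```xml
<!-- pom.xml fragment (illustrative); match versions to the cluster -->
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>${cluster.hbase.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>${cluster.hadoop.version}</version>
  <!-- provided at runtime by the cluster, so it is not bundled twice -->
  <scope>provided</scope>
</dependency>
```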
