
Flink SQL Client connect to secured kafka cluster

I want to execute a query on a Flink SQL table backed by a Kafka topic in a secured Kafka cluster. I'm able to execute the query programmatically, but unable to do the same through the Flink SQL client. I'm not sure how to pass the JAAS config (java.security.auth.login.config) and other system properties through the Flink SQL client.

Flink SQL query programmatically

 private static void simpleExec_auth() {

        // Create the execution environment.
        final EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .inStreamingMode()
                .withBuiltInCatalogName(
                        "default_catalog")
                .withBuiltInDatabaseName(
                        "default_database")
                .build();

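        // JVM properties for the Kerberos/JAAS login used by the Kafka client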
        System.setProperty("java.security.auth.login.config","client_jaas.conf");
        System.setProperty("sun.security.jgss.native", "true");
        System.setProperty("sun.security.jgss.lib", "/usr/libexec/libgsswrap.so");
        System.setProperty("javax.security.auth.useSubjectCredsOnly","false");

        TableEnvironment tableEnvironment = TableEnvironment.create(settings);
        String createQuery = "CREATE TABLE  test_flink11 ( " + "`keyid` STRING, " + "`id` STRING, "
                + "`name` STRING, " + "`age` INT, " + "`color` STRING, " + "`rowtime` TIMESTAMP(3) METADATA FROM 'timestamp', " + "`proctime` AS PROCTIME(), " + "`address` STRING) " + "WITH ( "
                + "'connector' = 'kafka', "
                + "'topic' = 'test_flink10', "
                + "'scan.startup.mode' = 'latest-offset', "
                + "'properties.bootstrap.servers' = 'kafka01.nyc.com:9092', "
                + "'value.format' = 'avro-confluent', "
                + "'key.format' = 'avro-confluent', "
                + "'key.fields' = 'keyid', "
                + "'value.fields-include' = 'EXCEPT_KEY', "
                + "'properties.security.protocol' = 'SASL_PLAINTEXT', 'properties.sasl.kerberos.service.name' = 'kafka', 'properties.sasl.kerberos.kinit.cmd' = '/usr/local/bin/skinit --quiet', 'properties.sasl.mechanism' = 'GSSAPI', "
                + "'key.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037', "
                + "'key.avro-confluent.schema-registry.subject' = 'test_flink6', "
                + "'value.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037', "
                + "'value.avro-confluent.schema-registry.subject' = 'test_flink4')";
        System.out.println(createQuery);
        tableEnvironment.executeSql(createQuery);
        TableResult result = tableEnvironment
                .executeSql("SELECT name,rowtime FROM test_flink11");
        result.print();
    }

This is working fine.

Flink SQL query through SQL client

Running this gives the following error.

Flink SQL> CREATE TABLE test_flink11 (`keyid` STRING,`id` STRING,`name` STRING,`address` STRING,`age` INT,`color` STRING) WITH('connector' = 'kafka', 'topic' = 'test_flink10','scan.startup.mode' = 'earliest-offset','properties.bootstrap.servers' = 'kafka01.nyc.com:9092','value.format' = 'avro-confluent','key.format' = 'avro-confluent','key.fields' = 'keyid', 'value.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037', 'value.avro-confluent.schema-registry.subject' = 'test_flink4', 'value.fields-include' = 'EXCEPT_KEY', 'key.avro-confluent.schema-registry.url' = 'http://kafka-schema-registry:5037', 'key.avro-confluent.schema-registry.subject' = 'test_flink6', 'properties.security.protocol' = 'SASL_PLAINTEXT', 'properties.sasl.kerberos.service.name' = 'kafka', 'properties.sasl.kerberos.kinit.cmd' = '/usr/local/bin/skinit --quiet', 'properties.sasl.mechanism' = 'GSSAPI');

Flink SQL> select * from test_flink11;
[ERROR] Could not execute SQL statement. Reason:
java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config' is /tmp/jaas-6309821891889949793.conf

There is nothing in /tmp/jaas-6309821891889949793.conf except the following comment:

# We are using this file as an workaround for the Kafka and ZK SASL implementation
# since they explicitly look for java.security.auth.login.config property
# Please do not edit/delete this file - See FLINK-3929
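
That error means the JAAS configuration the Kafka client ends up with has no KafkaClient login context. For reference, a minimal sketch of such an entry is shown below, assuming a keytab-based Kerberos login; the keytab path and principal are the placeholders from the flink-conf.yaml further down, and the same content would also work in a file such as the client_jaas.conf referenced in the programmatic snippet:

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/kafka.keytab"
  principal="XXXX@HADOOP.COM";
};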

SQL client run command

bin/sql-client.sh embedded --jar  flink-sql-connector-kafka_2.11-1.12.0.jar  --jar flink-sql-avro-confluent-registry-1.12.0.jar

Flink cluster command

bin/start-cluster.sh

How can I pass this java.security.auth.login.config and the other system properties (that I'm setting in the Java code snippet above) to the SQL client?

flink-conf.yaml

Either use the Kerberos ticket cache:

security.kerberos.login.use-ticket-cache: true
security.kerberos.login.principal: XXXXX@HADOOP.COM

or a keytab:

security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /path/to/kafka.keytab
security.kerberos.login.principal: XXXX@HADOOP.COM

and in either case register the Kafka login context:

security.kerberos.login.contexts: Client,KafkaClient

I haven't really tested whether this solution is feasible; you can try it out, and I hope it helps.
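
With security.kerberos.login.contexts including KafkaClient, Flink is expected to register a KafkaClient login context from these settings itself (the temporary /tmp/jaas-*.conf stays nearly empty by design, see FLINK-3929), so no manual java.security.auth.login.config should be needed. For the remaining JVM system properties from the programmatic snippet (sun.security.jgss.native, sun.security.jgss.lib, javax.security.auth.useSubjectCredsOnly), one untested option is env.java.opts in flink-conf.yaml, which Flink appends to the JVM options of the processes it starts and where the Kafka consumer actually runs; a sketch:

env.java.opts: -Dsun.security.jgss.native=true -Dsun.security.jgss.lib=/usr/libexec/libgsswrap.so -Djavax.security.auth.useSubjectCredsOnly=false

After editing flink-conf.yaml, restart the cluster (bin/stop-cluster.sh, then bin/start-cluster.sh) and reopen the SQL client so the new settings take effect.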
