
Heroku Apache Kafka Configuration with Spring Boot

I found many sample projects related to configuring Apache Kafka with Spring Boot. I tried some of them, and they work fine on my Windows machine, but when I try to run them on Heroku they give me an SSL connection error when connecting to Apache Kafka on Heroku.

Here are the configuration-class beans for my producer and consumer:

@Bean
public ProducerFactory<String, String> producerFactory() {
    Map<String, Object> config = new HashMap<>();

    config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "URL");
    config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

    return new DefaultKafkaProducerFactory<>(config);
}

@Bean
public ConsumerFactory<String, String> consumerFactory() {
    Map<String, Object> config = new HashMap<>();

    config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "URL");
    config.put(ConsumerConfig.GROUP_ID_CONFIG, "xyz");
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    return new DefaultKafkaConsumerFactory<>(config);
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}


@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}

You need to configure the truststore as described in the Heroku Kafka documentation.

An example using env-keystore might look like this:

EnvKeyStore envTrustStore = EnvKeyStore.createWithRandomPassword("KAFKA_TRUSTED_CERT");
EnvKeyStore envKeyStore = EnvKeyStore.createWithRandomPassword("KAFKA_CLIENT_CERT_KEY", "KAFKA_CLIENT_CERT");

File trustStore = envTrustStore.storeTemp();
File keyStore = envKeyStore.storeTemp();

properties.put(SslConfigs.SSL_TRUSTSTORE_TYPE_CONFIG, envTrustStore.type());
properties.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, trustStore.getAbsolutePath());
properties.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, envTrustStore.password());
properties.put(SslConfigs.SSL_KEYSTORE_TYPE_CONFIG, envKeyStore.type());
properties.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, keyStore.getAbsolutePath());
properties.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, envKeyStore.password());

See this GitHub repository for a complete example.

I upgraded my Spring Boot dependency to 2.2.1.RELEASE and Spring for Apache Kafka (spring-kafka) to 2.3.3.RELEASE, and updated my configuration class as shown below. It now connects to Apache Kafka on Heroku successfully.

private Map<String, Object> buildDefaults() {
    Map<String, Object> properties = new HashMap<>();
    List<String> hostPorts = Lists.newArrayList();

    for (String url : Splitter.on(",").split(checkNotNull(getenv("KAFKA_URL")))) {
        try {
            URI uri = new URI(url);
            hostPorts.add(format("%s:%d", uri.getHost(), uri.getPort()));

            switch (uri.getScheme()) {
                case "kafka":
                    properties.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "PLAINTEXT");
                    break;
                case "kafka+ssl":
                    properties.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");

                    try {
                        EnvKeyStore envTrustStore = EnvKeyStore.createWithRandomPassword("KAFKA_TRUSTED_CERT");
                        EnvKeyStore envKeyStore = EnvKeyStore.createWithRandomPassword("KAFKA_CLIENT_CERT_KEY", "KAFKA_CLIENT_CERT");

                        File trustStore = envTrustStore.storeTemp();
                        File keyStore = envKeyStore.storeTemp();

                        properties.put(SslConfigs.SSL_TRUSTSTORE_TYPE_CONFIG, envTrustStore.type());
                        properties.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, trustStore.getAbsolutePath());
                        properties.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, envTrustStore.password());
                        properties.put(SslConfigs.SSL_KEYSTORE_TYPE_CONFIG, envKeyStore.type());
                        properties.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, keyStore.getAbsolutePath());
                        properties.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, envKeyStore.password());
                        properties.put(SslConfigs.SSL_ENDPOINT_IDENTIFICATION_ALGORITHM_CONFIG, "");
                    } catch (Exception e) {
                        throw new RuntimeException("There was a problem creating the Kafka key stores", e);
                    }
                    break;
                default:
                    throw new IllegalArgumentException(format("unknown scheme; %s", uri.getScheme()));
            }
        } catch (URISyntaxException e) {
            throw new RuntimeException(e);
        }
    }

    properties.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, Joiner.on(",").join(hostPorts));
    return properties;
}

@Bean
public ProducerFactory<String, String> producerFactory() {
    Map<String, Object> config = buildDefaults();

    config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);

    return new DefaultKafkaProducerFactory<>(config);
}

@Bean
public ConsumerFactory<String, String> consumerFactory() {
    Map<String, Object> config = buildDefaults();
    config.put(ConsumerConfig.GROUP_ID_CONFIG, "xyz");
    config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    return new DefaultKafkaConsumerFactory<>(config);
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    return factory;
}

@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}
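As a side note, the KAFKA_URL parsing that buildDefaults() performs can be sketched without the Guava helpers (Splitter, Joiner, Lists), using only the JDK. The sample URL value below is made up for illustration; Heroku sets the real value in the KAFKA_URL config var:

```java
import java.net.URI;
import java.util.ArrayList;
import java.util.List;

public class KafkaUrlParse {
    // Converts a comma-separated list of kafka+ssl:// URLs into the
    // host:port list that bootstrap.servers expects.
    static String bootstrapServers(String kafkaUrl) {
        List<String> hostPorts = new ArrayList<>();
        for (String url : kafkaUrl.split(",")) {
            URI uri = URI.create(url);
            hostPorts.add(uri.getHost() + ":" + uri.getPort());
        }
        return String.join(",", hostPorts);
    }

    public static void main(String[] args) {
        // Hypothetical KAFKA_URL value, for illustration only.
        String sample = "kafka+ssl://host1.example.com:9096,kafka+ssl://host2.example.com:9096";
        System.out.println(bootstrapServers(sample));
        // prints host1.example.com:9096,host2.example.com:9096
    }
}
```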

It is quite possible that your Kafka broker requires SASL (JAAS) authentication over TLS. Please double-check, and add some extra configuration properties if so.
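For reference, if the broker did require SASL over TLS, the extra client properties would look roughly like the sketch below. The PLAIN mechanism and the username/password are placeholders, not values Heroku provides; check what your broker actually expects:

```java
import java.util.HashMap;
import java.util.Map;

public class SaslConfigSketch {
    // Builds the additional properties a SASL_SSL cluster would need,
    // on top of the usual serializer/bootstrap settings.
    static Map<String, Object> saslProperties(String username, String password) {
        Map<String, Object> props = new HashMap<>();
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config", String.format(
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                        + "username=\"%s\" password=\"%s\";",
                username, password));
        return props;
    }

    public static void main(String[] args) {
        System.out.println(saslProperties("user", "secret").get("security.protocol"));
        // prints SASL_SSL
    }
}
```

These entries can be merged into the config map built in producerFactory()/consumerFactory() in the same way as the SSL properties above.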
