How do I Connect Spring Cloud Stream Functional Beans to a Kafka Binder?

I'm using the Spring Cloud Stream documentation to try to work out how to connect my microservice to Kafka via the binder already pulled in through Gradle. I've tried creating a simple @Bean Function<String, String>() method within my Spring Boot application class and have verified that it is able to talk to Kafka by using the command line to interact with the uppercase-in-0 and uppercase-out-0 topics, as described at the beginning of the documentation, confirming that the application is able to communicate with Kafka. At this point I attempted to create the following classes with the expectation that they would load via auto-discovery:

package com.yuknis.loggingconsumer;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class LoggingConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(LoggingConsumerApplication.class, args);
    }

}

package com.yuknis.loggingconsumer.functions;

import java.util.function.Function;

public class CharCounter implements Function<String, Integer> {

    /**
     * Applies this function to the given argument.
     *
     * @param s the function argument
     * @return the function result
     */
    @Override
    public Integer apply(String s) {
        return s.length();
    }

}

With the application.properties file as follows:

spring.cloud.function.scan.packages:com.yuknis.loggingconsumer.functions

I'm not 100% sure what should happen now, but I'm assuming that it should see the class and automatically create charcounter-in-0 and charcounter-out-0 topics which I could publish to and consume from, with the data in those topics going through that function. That isn't what's happening. What might I be missing? Is this class supposed to create the topics the same way that the @Bean would?
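For reference, the @Bean style that did work for me looked roughly like this (a minimal sketch of the earlier uppercase test, not the exact code from my project, so the bean body is an assumption):

package com.yuknis.loggingconsumer;

import java.util.function.Function;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class LoggingConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(LoggingConsumerApplication.class, args);
    }

    // Declaring the function as a @Bean lets Spring Cloud Stream discover it
    // and create the uppercase-in-0 / uppercase-out-0 bindings automatically.
    @Bean
    public Function<String, String> uppercase() {
        return value -> value.toUpperCase();
    }

}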

Even though each of the functions is loaded when spring.cloud.function.scan.packages is set to the package and spring.cloud.function.scan.enabled is set to true, that alone still doesn't create the topics. You also need to set spring.cloud.function.definition to the Function, Consumer, or Supplier beans you'd like to have communicate with Kafka, like so:

spring.cloud.function:
  scan:
    enabled: true
    packages: com.yuknis.loggingconsumer.functions
  definition: charCounter;lowercase;uppercase
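
Equivalently, since the question uses application.properties rather than YAML, the same configuration in properties form would be (assuming the same package and function names):

spring.cloud.function.scan.enabled=true
spring.cloud.function.scan.packages=com.yuknis.loggingconsumer.functions
spring.cloud.function.definition=charCounter;lowercase;uppercase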

After that, it will create the charCounter-in-0 and charCounter-out-0 topics. If necessary, those bindings can be mapped to different topic names with the spring.cloud.stream.bindings.charCounter-in-0.destination and spring.cloud.stream.bindings.charCounter-out-0.destination properties.
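
For example, to point those bindings at existing topics (the topic names below are hypothetical):

spring.cloud.stream.bindings.charCounter-in-0.destination=char-counter-input
spring.cloud.stream.bindings.charCounter-out-0.destination=char-counter-output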
