Injecting properties into Spring Boot auto-configure classes in a non-Spring-managed web application
I used Spring Boot to create a library that acts as a Kafka client. The library essentially contains only classes, annotated with @SpringBootConfiguration and @EnableAutoConfiguration:
@Slf4j
@SpringBootConfiguration
@EnableAutoConfiguration
public class KafkaHandlerConfiguration {
...
}
and
@Service
interface SwiftalkKafkaGateway {
...
}
I built a jar with its dependencies, and this JAR will be used from a Java EE webapp via CDI. I obtain the beans from the CDI context with this code:
@Singleton
@ApplicationScoped
class SwiftalkAnnotatedSpringContextLoader {

    private final AnnotationConfigApplicationContext springContext;

    SwiftalkAnnotatedSpringContextLoader() {
        springContext = new AnnotationConfigApplicationContext();
        springContext.scan("com.digite.cloud.swiftalk");
        springContext.refresh();
    }

    ApplicationContext getSwiftalkKafkaClientContext() {
        return this.springContext;
    }
}
How do I pass the properties that the Spring Boot auto-configuration needs in order to start the beans? I have the spring.kafka property group plus custom properties, which are injected into KafkaHandlerConfiguration via @Value annotations:
@Value("${digite.swiftalk.kafka.executor.core-pool-size:10}")
private Integer corePoolSize;
@Value("${digite.swiftalk.kafka.executor.max-pool-size:20}")
private Integer maxPoolSize;
@Value("${digite.swiftalk.kafka.executor.queue-capacity:100}")
private Integer queueCapacity;
and
"spring.kafka.producer.properties.max.block.ms=1000",
"spring.kafka.producer.bootstrap-servers=localhost:9999",
"spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer",
"spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer",
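The `${name:default}` syntax used in the @Value annotations above resolves the property from the environment if present, and otherwise falls back to the text after the first colon. A minimal stdlib sketch of that lookup rule (illustrative only, not Spring's actual resolver):

```java
import java.util.Map;

public class PlaceholderDemo {
    // Sketch of ${name:default} resolution: a configured value wins,
    // otherwise the text after the first ':' is used as the fallback.
    public static String resolve(String placeholder, Map<String, String> env) {
        String body = placeholder.substring(2, placeholder.length() - 1); // strip ${ and }
        int sep = body.indexOf(':');
        String key = sep >= 0 ? body.substring(0, sep) : body;
        String fallback = sep >= 0 ? body.substring(sep + 1) : null;
        String value = env.get(key);
        return value != null ? value : fallback;
    }

    public static void main(String[] args) {
        Map<String, String> env = Map.of("digite.swiftalk.kafka.executor.core-pool-size", "20");
        // Configured property overrides the annotation default...
        System.out.println(resolve("${digite.swiftalk.kafka.executor.core-pool-size:10}", env)); // prints 20
        // ...while an unset property falls back to the default.
        System.out.println(resolve("${digite.swiftalk.kafka.executor.queue-capacity:100}", env)); // prints 100
    }
}
```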
Using ConfigurableEnvironment and MutablePropertySources worked for me. Here is how I load the environment into the context:
@Singleton
@ApplicationScoped
class SwiftalkAnnotatedSpringContextLoader {

    private final AnnotationConfigApplicationContext springContext;

    SwiftalkAnnotatedSpringContextLoader() throws IOException {
        springContext = new AnnotationConfigApplicationContext();
        ConfigurableEnvironment environment = new StandardEnvironment();
        MutablePropertySources propertySources = environment.getPropertySources();
        Properties appProps = new Properties();
        appProps.load(this.getClass().getClassLoader().getResourceAsStream("spring-config.properties"));
        propertySources.addFirst(new PropertySource<Properties>("spring-properties", appProps) {
            @Override
            public Object getProperty(String name) {
                return appProps.getProperty(name);
            }
        });
        springContext.setEnvironment(environment);
        springContext.scan("com.digite.cloud.swiftalk");
        springContext.refresh();
    }

    ApplicationContext getSwiftalkKafkaClientContext() {
        return this.springContext;
    }
}
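The addFirst call matters: property sources are consulted in order, so the file-backed source is asked before the system-property and environment-variable sources that StandardEnvironment registers, and its values win. A minimal stdlib sketch of that first-match precedence (the names here are illustrative, not Spring API):

```java
import java.util.List;
import java.util.Map;

public class PrecedenceDemo {
    // First-match lookup over an ordered list of sources, mirroring how
    // MutablePropertySources consults its sources: the source added with
    // addFirst shadows everything registered after it.
    public static String lookup(String key, List<Map<String, String>> sources) {
        for (Map<String, String> source : sources) {
            String value = source.get(key);
            if (value != null) {
                return value;
            }
        }
        return null;
    }

    public static void main(String[] args) {
        Map<String, String> fileProps = Map.of("spring.kafka.producer.bootstrap-servers", "localhost:19092");
        Map<String, String> systemProps = Map.of("spring.kafka.producer.bootstrap-servers", "localhost:9999");
        // fileProps was "addFirst"-ed, so it shadows the system property.
        System.out.println(lookup("spring.kafka.producer.bootstrap-servers",
                List.of(fileProps, systemProps))); // prints localhost:19092
    }
}
```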
I added the following file under src/test/resources:
spring.data.mongodb.database=embedded
spring.data.mongodb.port=12345
spring.data.mongodb.host=localhost
spring.kafka.producer.properties.max.block.ms=2000
spring.kafka.producer.bootstrap-servers=localhost:19092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
digite.swiftalk.kafka.upstream-type-header=UPSTREAM-TYPE
digite.swiftalk.kafka.upstream-instance-header=INSTANCE-HEADER
digite.swiftalk.kafka.message-key-header=MESSAGE-KEY-HEADER
digite.swiftalk.kafka.executor.core-pool-size=20
digite.swiftalk.kafka.executor.max-pool-size=50
digite.swiftalk.kafka.executor.queue-capacity=1000
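The loader reads this file with java.util.Properties, whose load method parses the key=value format shown above. A small stdlib sketch of that parsing step, fed from an in-memory string instead of the classpath for illustration:

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class PropertiesLoadDemo {
    // Parses key=value content the same way Properties.load reads
    // spring-config.properties from the classpath.
    public static Properties parse(String content) throws IOException {
        Properties props = new Properties();
        props.load(new StringReader(content));
        return props;
    }

    public static void main(String[] args) throws IOException {
        Properties props = parse(
                "digite.swiftalk.kafka.executor.core-pool-size=20\n"
                + "spring.kafka.producer.bootstrap-servers=localhost:19092\n");
        // These entries are what the custom PropertySource serves to Spring.
        System.out.println(props.getProperty("digite.swiftalk.kafka.executor.core-pool-size")); // prints 20
    }
}
```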
And the test:
@Test
void testLoadsSpringApplicationContext() throws IOException {
    SwiftalkAnnotatedSpringContextLoader loader = new SwiftalkAnnotatedSpringContextLoader();
    SwiftalkKafkaGateway kafkaGateway = loader.getSwiftalkKafkaClientContext().getBean(SwiftalkKafkaGateway.class);
    assertNotNull(kafkaGateway);
    ThreadPoolTaskExecutor asyncExecutor = loader.getSwiftalkKafkaClientContext()
            .getBean(ThreadPoolTaskExecutor.class);
    Assertions.assertEquals(20, asyncExecutor.getCorePoolSize());
}
This confirms the file value (20) overrides the annotation default, since in the Spring Boot library the default value of corePoolSize is 10:
@Value("${digite.swiftalk.kafka.executor.core-pool-size:10}")
private Integer corePoolSize;