How to pass arguments from slave steps to the reader in Spring Batch?
I have a Spring Batch job that reads data from a database. Basically, I have a SQL query that needs to fetch data filtered by a column (`type`) value. That column has 50 distinct values, so there are 50 queries, and each one is executed in a separate slave step. But the query is built inside the reader, so I need to pass each type to the reader to build the query and read the data. I am using a `Partitioner` to split the query with `Offset` and `Limit`.
Here is the code I have:
private Flow flow(List<Step> steps) {
    SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor();
    taskExecutor.setConcurrencyLimit(1);
    return new FlowBuilder<SimpleFlow>("flow")
            .split(taskExecutor)
            .add(steps.stream()
                    .map(step -> new FlowBuilder<Flow>("flow_" + step.getName()).start(step).build())
                    .toArray(Flow[]::new))
            .build();
}

@Bean
public Job job() {
    List<Step> masterSteps = TYPES.stream().map(this::masterStep).collect(Collectors.toList());
    return jobBuilderFactory.get("job")
            .incrementer(new RunIdIncrementer())
            .start(flow(masterSteps))
            .end()
            .build();
}

@Bean
@SneakyThrows
public Step slaveStep(String type) {
    return stepBuilderFactory.get("slaveStep")
            .<User, User>chunk(100)
            .reader(reader(type, 0, 0))
            .writer(writer())
            .build();
}

@Bean
@SneakyThrows
public Step masterStep(String type) {
    return stepBuilderFactory.get("masterStep")
            .partitioner(slaveStep(type).getName(), partitioner(0))
            .step(slaveStep(type))
            .gridSize(5)
            .taskExecutor(executor)
            .build();
}

@Bean
@StepScope
@SneakyThrows
public JdbcCursorItemReader<User> reader(String type,
        @Value("#{stepExecutionContext['offset']}") Integer offset,
        @Value("#{stepExecutionContext['limit']}") Integer limit) {
    // Ex: SELECT * FROM users WHERE type = 'type' OFFSET 500 LIMIT 1000;
    String query = MessageFormat.format(SELECT_QUERY, type, offset, limit);
    JdbcCursorItemReader<User> itemReader = new JdbcCursorItemReader<>();
    itemReader.setSql(query);
    itemReader.setDataSource(dataSource);
    itemReader.setRowMapper(new UserMapper());
    itemReader.afterPropertiesSet();
    return itemReader;
}

@Bean
@StepScope
public ItemWriter<User> writer() {
    return new Writer();
}

@Bean
@StepScope
public Partitioner partitioner(@Value("#{jobParameters['limit']}") int limit) {
    return new Partitioner(limit);
}
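As an aside on the query construction above: the post's `SELECT_QUERY` pattern is not shown, but if it is an ordinary `MessageFormat` pattern, there is a subtle pitfall worth checking. `MessageFormat` applies locale-sensitive number formatting to `Number` arguments, so a limit of 1000 can come out as `1,000` and break the SQL. A small standalone sketch (the pattern strings here are hypothetical stand-ins, not the post's actual constant):

```java
import java.text.MessageFormat;
import java.util.Locale;

public class QueryFormatDemo {
    // Hypothetical pattern standing in for the post's SELECT_QUERY constant.
    static final String NAIVE = "SELECT * FROM users WHERE type = ''{0}'' OFFSET {1} LIMIT {2}";
    // {n,number,#} disables locale-dependent digit grouping.
    static final String SAFE = "SELECT * FROM users WHERE type = ''{0}'' OFFSET {1,number,#} LIMIT {2,number,#}";

    static String format(String pattern, Object... args) {
        // Pin the locale so the demo is deterministic.
        return new MessageFormat(pattern, Locale.US).format(args);
    }

    public static void main(String[] args) {
        // Integer arguments get grouped digits: 1000 becomes "1,000".
        System.out.println(format(NAIVE, "admin", 500, 1000));
        // -> SELECT * FROM users WHERE type = 'admin' OFFSET 500 LIMIT 1,000
        System.out.println(format(SAFE, "admin", 500, 1000));
        // -> SELECT * FROM users WHERE type = 'admin' OFFSET 500 LIMIT 1000
    }
}
```

If the real pattern already uses `{n,number,#}` (or the values are passed as strings), this does not apply.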
The issue is that the `type` value is not being passed into the `reader()` method. And even when I add the `@Bean` annotation, it says `Could not autowire. No beans of 'String' type found.` If I leave `@Bean` off, `offset` and `limit` are always 0 because the `@Value` expressions are not populated. Right now when I execute the batch, nothing happens inside the reader because `type` is null. When I hardcode the value, it works. So how can I fix this? Thanks in advance.
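Editor's note, not part of the original question: the usual Spring Batch pattern for this is to let the `Partitioner` put the per-partition value into each partition's `ExecutionContext`, and bind it in the step-scoped reader with `#{stepExecutionContext['type']}` instead of a plain method argument. A sketch under that assumption, reusing the post's `SELECT_QUERY`, `dataSource`, and `UserMapper` (the class name `TypePartitioner` is hypothetical):

```java
// Sketch: the partitioner contributes 'type' alongside 'offset'/'limit'
// to each partition's ExecutionContext.
public class TypePartitioner implements Partitioner {

    private final List<String> types;
    private final int limit;

    public TypePartitioner(List<String> types, int limit) {
        this.types = types;
        this.limit = limit;
    }

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        int i = 0;
        for (String type : types) {
            ExecutionContext ctx = new ExecutionContext();
            ctx.putString("type", type);  // the value the reader was missing
            ctx.putInt("offset", 0);      // per-partition paging window
            ctx.putInt("limit", limit);
            partitions.put("partition" + i++, ctx);
        }
        return partitions;
    }
}

// The reader then binds everything from the step execution context:
@Bean
@StepScope
public JdbcCursorItemReader<User> reader(
        @Value("#{stepExecutionContext['type']}") String type,
        @Value("#{stepExecutionContext['offset']}") Integer offset,
        @Value("#{stepExecutionContext['limit']}") Integer limit) {
    String query = MessageFormat.format(SELECT_QUERY, type, offset, limit);
    JdbcCursorItemReader<User> itemReader = new JdbcCursorItemReader<>();
    itemReader.setSql(query);
    itemReader.setDataSource(dataSource);
    itemReader.setRowMapper(new UserMapper());
    return itemReader;
}
```

This is a configuration fragment, not a complete job; the point is that late binding via `@Value` only works for values the partitioner actually stores in the `ExecutionContext`.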
If you are iterating over every `TYPE` and executing a `masterStep` for each, why don't you remove that `TYPE` logic and instead run `SELECT * FROM table OFFSET ? LIMIT ?`, handling `offset` and `limit` inside the `Partitioner`? Your 5 threads will then handle the partitions. If your final goal is to process every record in that table, then you can simply use this approach without worrying about `TYPE` and executing each one in a separate `Step`.
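The offset/limit arithmetic this suggests can be sketched without any Spring dependencies. The names below (`computeWindows`, `Window`) are illustrative, not from the original post; the method splits a known row count into `gridSize` page windows the way a partitioner would populate each `ExecutionContext`:

```java
import java.util.ArrayList;
import java.util.List;

public class WindowDemo {

    // One (offset, limit) page window per partition.
    record Window(int offset, int limit) {}

    // Split 'totalRows' rows into 'gridSize' roughly equal, contiguous windows.
    static List<Window> computeWindows(int totalRows, int gridSize) {
        List<Window> windows = new ArrayList<>();
        int base = totalRows / gridSize;
        int remainder = totalRows % gridSize;
        int offset = 0;
        for (int i = 0; i < gridSize; i++) {
            // Spread the remainder over the first partitions so every row is covered.
            int limit = base + (i < remainder ? 1 : 0);
            windows.add(new Window(offset, limit));
            offset += limit;
        }
        return windows;
    }

    public static void main(String[] args) {
        // e.g. 1002 rows across gridSize = 5 threads
        for (Window w : computeWindows(1002, 5)) {
            System.out.println("OFFSET " + w.offset() + " LIMIT " + w.limit());
        }
    }
}
```

A real `Partitioner` would put each window's values into an `ExecutionContext` (`ctx.putInt("offset", ...)`, `ctx.putInt("limit", ...)`) so the step-scoped reader can bind them.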