Spring Batch 3.0 : StepExecutionListener for a partitioned Step and cascading of execution context values to the partitioned job
Given a Spring Batch Job that uses partitioning:
<job id="reportingJob" xmlns="http://www.springframework.org/schema/batch">
<batch:listeners>
<batch:listener ref="reportingJobExecutionListenerr" />
</batch:listeners>
<batch:step id="reportingMasterStep">
<partition step="reportingSlaveStep"
partitioner="reportingPartitioner">
<batch:handler grid-size="10" task-executor="taskExecutor" />
</partition>
</batch:step>
</job>
And reportingSlaveStep defined as:
<step id="reportingSlaveStep" xmlns="http://www.springframework.org/schema/batch">
<job ref="reportingSlaveJob" />
</step>
And reportingSlaveJob defined as:
<job id="reportingSlaveJob" xmlns="http://www.springframework.org/schema/batch">
<batch:listeners>
<batch:listener ref="reportsOutputListener" />
</batch:listeners>
<batch:split id="reportsCreationSplit"
task-executor="taskExecutor">
<batch:flow>
<batch:step id="basicReportStep">
<tasklet throttle-limit="5" task-executor="taskExecutor">
<batch:chunk reader="basicReportReader"
writer="basicReportWriter" commit-interval="500" />
</tasklet>
</batch:step>
</batch:flow>
<batch:flow>
<batch:step id="advancedReportStep">
<tasklet throttle-limit="5" task-executor="taskExecutor">
<batch:chunk reader="advancedReportDataReader" writer="advancedReportWriter"
commit-interval="500" />
</tasklet>
</batch:step>
</batch:flow>
</batch:split>
</job>
I now have 2 questions:

1. I want a new reportsOutputListener instance to be created for each partition. Can I achieve this by making reportsOutputListener a Step scoped bean?
2. I want the jobExecutionContext created for reportingJob to be accessible in reportingSlaveJob. Do I need to do any special handling for this, or is the same jobExecutionContext instance from reportingJob used by the reportingSlaveJob as well? At the moment I get a NullPointerException on MapExecutionContextDao.java:130.

EDIT: Also note that for point 2, the slaveJob is unable to access the values added to the stepExecutionContext by the reportingPartitioner (accessed using #{stepExecutionContext['msbfBatchId']} in the Spring config XML). The values in the stepExecutionContext against that key come out as null.
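For context, the wiring the EDIT refers to would look roughly like the sketch below; the reader class and property name are hypothetical, only the SpEL expression comes from the question:

<!-- hypothetical step-scoped reader that tries to read the value the partitioner
     put into the step execution context; this is the lookup that comes back null -->
<bean id="basicReportReader" class="com.example.reporting.BasicReportReader" scope="step">
    <property name="batchId" value="#{stepExecutionContext['msbfBatchId']}" />
</bean>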
I want a new reportsOutputListener instance to be created for each partition. Can I achieve this by making reportsOutputListener a Step scoped bean?
The answer is Yes (as mentioned in the comments by Mahmoud Ben Hassine).
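As a minimal sketch of what that could look like in the XML config (the listener class name here is hypothetical; the step scope itself should already be registered by the batch XML namespace):

<!-- hypothetical listener class; scope="step" makes Spring create a new
     reportsOutputListener instance for each partition's step execution -->
<bean id="reportsOutputListener"
      class="com.example.reporting.ReportsOutputListener"
      scope="step" />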
I want the jobExecutionContext created for reportingJob to be accessible in reportingSlaveJob. Do I need to do any special handling for this, or is the same jobExecutionContext instance from reportingJob used by the reportingSlaveJob as well?
The answer is No. I dug into the Spring Batch code and found that JobStep uses a JobParametersExtractor to copy values from the stepExecutionContext to the JobParameters. This means that reportingSlaveJob can access these values from the JobParameters instead of the StepExecutionContext. That said, for some reason, the DefaultJobParametersExtractor implementation in Spring Batch 3.0 doesn't seem to copy the values to the jobParameters as expected. I ended up writing the following custom extractor:
import java.util.Arrays;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameter;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.step.job.JobParametersExtractor;
import org.springframework.batch.item.ExecutionContext;

public class CustomJobParametersExtractor implements JobParametersExtractor {

    private Set<String> keys;

    public CustomJobParametersExtractor() {
        this.keys = new HashSet<>();
    }

    @Override
    public JobParameters getJobParameters(Job job, StepExecution stepExecution) {
        JobParametersBuilder builder = new JobParametersBuilder();
        Map<String, JobParameter> jobParameters = stepExecution.getJobParameters().getParameters();
        ExecutionContext stepExecutionContext = stepExecution.getExecutionContext();
        ExecutionContext jobExecutionContext = stepExecution.getJobExecution().getExecutionContext();

        // copy job parameters from the parent job to the delegate job
        for (String key : jobParameters.keySet()) {
            builder.addParameter(key, jobParameters.get(key));
        }

        // copy the configured keys from the parent job/step execution context to the delegate job
        for (String key : keys) {
            if (jobExecutionContext.containsKey(key)) {
                builder.addString(key, jobExecutionContext.getString(key));
            } else if (stepExecutionContext.containsKey(key)) {
                builder.addString(key, stepExecutionContext.getString(key));
            } else if (jobParameters.containsKey(key)) {
                builder.addString(key, (String) jobParameters.get(key).getValue());
            }
        }
        return builder.toJobParameters();
    }

    public void setKeys(String[] keys) {
        this.keys = new HashSet<>(Arrays.asList(keys));
    }
}
I can then use the above extractor in the reporting slave step as follows:
<step id="reportingSlaveStep" xmlns="http://www.springframework.org/schema/batch">
<job ref="reportingSlaveJob" job-parameters-extractor="customJobParametersExtractor"/>
</step>
where customJobParametersExtractor is a bean of type CustomJobParametersExtractor which is passed all the keys that we want to copy to the JobParameters of reportingSlaveJob.
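A sketch of that bean wiring, assuming msbfBatchId (the key mentioned in the question) is one of the values to copy; the reader class and property name are illustrative only:

<!-- hypothetical wiring: "keys" lists the execution-context entries the extractor copies -->
<bean id="customJobParametersExtractor" class="com.example.reporting.CustomJobParametersExtractor">
    <property name="keys">
        <list>
            <value>msbfBatchId</value>
        </list>
    </property>
</bean>

<!-- inside reportingSlaveJob the copied value is then read from the job parameters
     (late-bound in a step-scoped bean) rather than from the step execution context -->
<bean id="basicReportReader" class="com.example.reporting.BasicReportReader" scope="step">
    <property name="batchId" value="#{jobParameters['msbfBatchId']}" />
</bean>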
When I run the above job, at times I get an exception saying "A job execution for this job is already running" and at other times I get a NullPointerException on MapExecutionContextDao.java:130.
The reason this was happening was that WITHOUT my CustomJobParametersExtractor, the reportingSlaveJob was getting launched with empty JobParameters. For Spring Batch to create a new job instance, the job parameters have to be different for each launch of reportingSlaveJob. Using the CustomJobParametersExtractor fixed this issue as well.