
Spring Batch 3.0: StepExecutionListener for a partitioned Step and cascading of execution context values to the partitioned job

Given a Spring Batch job that uses partitioning:

<job id="reportingJob" xmlns="http://www.springframework.org/schema/batch">
        <batch:listeners>
            <batch:listener ref="reportingJobExecutionListener" />
        </batch:listeners>
        <batch:step id="reportingMasterStep">
            <partition step="reportingSlaveStep"
                partitioner="reportingPartitioner">
                <batch:handler grid-size="10" task-executor="taskExecutor" />
            </partition>
        </batch:step>
</job>

And reportingSlaveStep defined as:

<step id="reportingSlaveStep" xmlns="http://www.springframework.org/schema/batch">
        <job ref="reportingSlaveJob" />
</step>

And reportingSlaveJob defined as:

    <job id="reportingSlaveJob" xmlns="http://www.springframework.org/schema/batch">
        <batch:listeners>
            <batch:listener ref="reportsOutputListener" />
        </batch:listeners>
        <batch:split id="reportsCreationSplit"
            task-executor="taskExecutor">
            <batch:flow>
                <batch:step id="basicReportStep">
                    <tasklet throttle-limit="5" task-executor="taskExecutor">
                        <batch:chunk reader="basicReportReader"
                            writer="basicReportWriter" commit-interval="500" />
                    </tasklet>
                </batch:step>
            </batch:flow>
            <batch:flow>
                <batch:step id="advancedReportStep">
                    <tasklet throttle-limit="5" task-executor="taskExecutor">
                        <batch:chunk reader="advancedReportDataReader" writer="advancedReportWriter"
                            commit-interval="500" />
                    </tasklet>
                </batch:step>
            </batch:flow>
       </batch:split>
     </job>

I now have two questions:

  1. I want a new reportsOutputListener instance to be created for each partition. Can I achieve this by making reportsOutputListener a step-scoped bean?
  2. I want the same jobExecutionContext created for reportingJob to be accessible in reportingSlaveJob. Do I need any special handling for this, or is the same jobExecutionContext instance from reportingJob used by reportingSlaveJob as well?
  3. EDIT: When I run the above job, at times I get an exception saying "A job execution for this job is already running", and other times I get a NullPointerException at MapExecutionContextDao.java:130.

EDIT: Also note that, for point 2, reportingSlaveJob is unable to access the values added to the stepExecutionContext by the reportingPartitioner (accessed using #{stepExecutionContext['msbfBatchId']} in the Spring config XML). The values in the stepExecutionContext for that key come out as null.

I want a new reportsOutputListener instance to be created for each partition. Can I achieve this by making reportsOutputListener a step-scoped bean?

The answer is yes (as mentioned in the comments by Mahmoud Ben Hassine).
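In XML config, step scope can be declared directly on the listener bean. A minimal sketch (the class name com.example.ReportsOutputListener is an assumption; substitute your own implementation):

```xml
<!-- scope="step" makes Spring create a fresh, proxy-backed instance of this
     bean for every step execution, so each partition gets its own listener.
     The class name below is illustrative. -->
<bean id="reportsOutputListener"
      class="com.example.ReportsOutputListener"
      scope="step" />
```

Because step-scoped beans are resolved lazily through a proxy, each of the partitions created by reportingPartitioner gets its own listener instance at execution time.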

I want the same jobExecutionContext created for reportingJob to be accessible in reportingSlaveJob. Do I need any special handling for this, or is the same jobExecutionContext instance from reportingJob used by reportingSlaveJob as well?

The answer is no. I dug into the Spring Batch code and found that JobStep uses a JobParametersExtractor to copy values from the stepExecutionContext to the JobParameters. This means that reportingSlaveJob can access these values from the JobParameters instead of the StepExecutionContext. That said, for some reason, the DefaultJobParametersExtractor implementation in Spring Batch 3.0 doesn't seem to copy the values to the jobParameters as expected. I ended up writing the following custom extractor:

import java.util.Arrays;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameter;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.step.job.JobParametersExtractor;
import org.springframework.batch.item.ExecutionContext;

public class CustomJobParametersExtractor implements JobParametersExtractor {

    // Keys to copy from the parent's contexts into the delegate job's parameters.
    private Set<String> keys;

    public CustomJobParametersExtractor() {
        this.keys = new HashSet<>();
    }

    @Override
    public JobParameters getJobParameters(Job job, StepExecution stepExecution) {
        JobParametersBuilder builder = new JobParametersBuilder();
        Map<String, JobParameter> jobParameters = stepExecution.getJobParameters().getParameters();
        ExecutionContext stepExecutionContext = stepExecution.getExecutionContext();
        ExecutionContext jobExecutionContext = stepExecution.getJobExecution().getExecutionContext();

        // copy job parameters from parent job to delegate job
        for (String key : jobParameters.keySet()) {
            builder.addParameter(key, jobParameters.get(key));
        }

        // copy job/step context values from parent job/step to delegate job,
        // preferring the job context, then the step context, then the parameters
        for (String key : keys) {
            if (jobExecutionContext.containsKey(key)) {
                builder.addString(key, jobExecutionContext.getString(key));
            } else if (stepExecutionContext.containsKey(key)) {
                builder.addString(key, stepExecutionContext.getString(key));
            } else if (jobParameters.containsKey(key)) {
                builder.addString(key, (String) jobParameters.get(key).getValue());
            }
        }
        return builder.toJobParameters();
    }

    public void setKeys(String[] keys) {
        this.keys = new HashSet<>(Arrays.asList(keys));
    }

}

I can then use the above extractor in the reporting slave step as follows:

<step id="reportingSlaveStep" xmlns="http://www.springframework.org/schema/batch">
        <job ref="reportingSlaveJob" job-parameters-extractor="customJobParametersExtractor"/>
</step>

where customJobParametersExtractor is a bean of type CustomJobParametersExtractor that is configured with all the keys we want copied into the JobParameters of reportingSlaveJob.
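For completeness, that bean can be wired up along these lines (the package com.example is an assumption, and msbfBatchId is the key mentioned in the question; list whichever keys you need):

```xml
<bean id="customJobParametersExtractor"
      class="com.example.CustomJobParametersExtractor">
    <!-- Spring converts the comma-separated value into the String[] expected
         by setKeys(); msbfBatchId is the key set by reportingPartitioner. -->
    <property name="keys" value="msbfBatchId" />
</bean>
```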

When I run the above job, at times I get an exception saying "A job execution for this job is already running" and other times I get a NullPointerException at MapExecutionContextDao.java:130

This was happening because, WITHOUT my CustomJobParametersExtractor, reportingSlaveJob was being launched with empty JobParameters. For Spring Batch to create a new job instance, the job parameters have to be different for each launch of reportingSlaveJob. Using the CustomJobParametersExtractor fixed this issue as well.
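If the copied keys could ever be identical across launches, one extra safeguard (my own suggestion, not part of the original extractor) is to add a parameter inside getJobParameters that is unique per partition:

```java
// Hypothetical addition to getJobParameters(): the step execution id is
// unique per partition, so the resulting JobParameters always differ and
// Spring Batch creates a fresh JobInstance for each slave launch.
builder.addLong("partition.step.execution.id", stepExecution.getId());
```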


 