
Spring Batch 3.0: StepExecutionListener for a partitioned Step and cascading of execution context values to the partitioned job

Given a Spring Batch Job that uses partitioning:

<job id="reportingJob" xmlns="http://www.springframework.org/schema/batch">
        <batch:listeners>
            <batch:listener ref="reportingJobExecutionListenerr" />
        </batch:listeners>
        <batch:step id="reportingMasterStep">
            <partition step="reportingSlaveStep"
                partitioner="reportingPartitioner">
                <batch:handler grid-size="10" task-executor="taskExecutor" />
            </partition>
        </batch:step>
</job>

And reportingSlaveStep defined as:

<step id="reportingSlaveStep" xmlns="http://www.springframework.org/schema/batch">
        <job ref="reportingSlaveJob" />
</step>

And reportingSlaveJob defined as:

    <job id="reportingSlaveJob" xmlns="http://www.springframework.org/schema/batch">
        <batch:listeners>
            <batch:listener ref="reportsOutputListener" />
        </batch:listeners>
        <batch:split id="reportsCreationSplit"
            task-executor="taskExecutor">
            <batch:flow>
                <batch:step id="basicReportStep">
                    <tasklet throttle-limit="5" task-executor="taskExecutor">
                        <batch:chunk reader="basicReportReader"
                            writer="basicReportWriter" commit-interval="500" />
                    </tasklet>
                </batch:step>
            </batch:flow>
            <batch:flow>
                <batch:step id="advancedReportStep">
                    <tasklet throttle-limit="5" task-executor="taskExecutor">
                        <batch:chunk reader="advancedReportDataReader" writer="advancedReportWriter"
                            commit-interval="500" />
                    </tasklet>
                </batch:step>
            </batch:flow>
       </batch:split>
     </job>

I now have the following questions:

  1. I want a new reportsOutputListener instance to be created for each partition. Can I achieve this by making reportsOutputListener a step-scoped bean?
  2. I want the jobExecutionContext created for reportingJob to be accessible in reportingSlaveJob. Do I need any special handling for this, or is the same jobExecutionContext instance from reportingJob used by reportingSlaveJob as well?
  3. EDIT: When I run the above job, at times I get an exception saying "A job execution for this job is already running", and at other times I get a NullPointerException at MapExecutionContextDao.java:130.

EDIT: Also note that, for point 2, reportingSlaveJob is unable to access the values that reportingPartitioner added to the stepExecutionContext (accessed using #{stepExecutionContext['msbfBatchId']} in the Spring config XML). The values against that key come out as null.

I want a new reportsOutputListener instance to be created for each partition. Can I achieve this by making reportsOutputListener a step-scoped bean?

The answer is Yes (as mentioned in the comments by Mahmoud Ben Hassine).
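For reference, a minimal sketch of what such a step-scoped bean definition might look like (the listener class name here is hypothetical):

<bean id="reportsOutputListener" class="com.example.ReportsOutputListener" scope="step" />

When the batch XML namespace is in use, the step scope is registered automatically, so a fresh reportsOutputListener instance is created for each step execution, i.e. for each partition.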

I want the jobExecutionContext created for reportingJob to be accessible in reportingSlaveJob. Do I need any special handling for this, or is the same jobExecutionContext instance from reportingJob used by reportingSlaveJob as well?

The answer is No. I dug into the Spring Batch code and found that JobStep uses a JobParametersExtractor to copy values from the stepExecutionContext to the JobParameters. This means that reportingSlaveJob can access these values from the JobParameters instead of the StepExecutionContext. That said, for some reason, the DefaultJobParametersExtractor implementation in Spring Batch 3.0 doesn't seem to copy the values to the jobParameters as expected. I ended up writing the following custom extractor:

import java.util.Arrays;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameter;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.step.job.JobParametersExtractor;
import org.springframework.batch.item.ExecutionContext;

public class CustomJobParametersExtractor implements JobParametersExtractor {

    private Set<String> keys;

    public CustomJobParametersExtractor() {
        this.keys = new HashSet<>();
    }

    @Override
    public JobParameters getJobParameters(Job job, StepExecution stepExecution) {
        JobParametersBuilder builder = new JobParametersBuilder();
        Map<String, JobParameter> jobParameters = stepExecution.getJobParameters().getParameters();
        ExecutionContext stepExecutionContext = stepExecution.getExecutionContext();
        ExecutionContext jobExecutionContext = stepExecution.getJobExecution().getExecutionContext();

        // copy all job parameters from the parent job to the delegate job
        for (String key : jobParameters.keySet()) {
            builder.addParameter(key, jobParameters.get(key));
        }

        // copy the configured keys from the parent job/step execution context
        // to the delegate job, preferring the job context over the step context
        for (String key : keys) {
            if (jobExecutionContext.containsKey(key)) {
                builder.addString(key, jobExecutionContext.getString(key));
            } else if (stepExecutionContext.containsKey(key)) {
                builder.addString(key, stepExecutionContext.getString(key));
            } else if (jobParameters.containsKey(key)) {
                builder.addString(key, (String) jobParameters.get(key).getValue());
            }
        }
        return builder.toJobParameters();
    }

    public void setKeys(String[] keys) {
        this.keys = new HashSet<>(Arrays.asList(keys));
    }

}

I can then use the above extractor in the reporting slave step as follows:

<step id="reportingSlaveStep" xmlns="http://www.springframework.org/schema/batch">
        <job ref="reportingSlaveJob" job-parameters-extractor="customJobParametersExtractor"/>
</step>

where customJobParametersExtractor is a bean of type CustomJobParametersExtractor that is passed all the keys we want to copy into the JobParameters of reportingSlaveJob.
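For completeness, a sketch of how customJobParametersExtractor might be wired (the package name is assumed; msbfBatchId is the context key mentioned in the question):

<bean id="customJobParametersExtractor" class="com.example.CustomJobParametersExtractor">
        <property name="keys" value="msbfBatchId" />
</bean>

Spring converts the comma-separated value attribute into the String[] expected by setKeys, so several keys can be listed in one attribute.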

When I run the above job, at times I get an exception saying "A job execution for this job is already running" and other times I get a NullPointerException at MapExecutionContextDao.java:130

The reason this was happening was that, without my CustomJobParametersExtractor, reportingSlaveJob was being launched with empty JobParameters. For Spring Batch to create a new job instance, the job parameters have to be different for each launch of reportingSlaveJob. Using the CustomJobParametersExtractor fixed this issue as well.
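To see why the parameters end up distinct, recall that each partition gets its own stepExecutionContext from the partitioner. A minimal sketch of a partitioner that seeds a distinct value per partition (the msbfBatchId key comes from the question; everything else is illustrative, not the actual reportingPartitioner):

import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.partition.support.Partitioner;
import org.springframework.batch.item.ExecutionContext;

public class ReportingPartitioner implements Partitioner {

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            // a distinct value per partition, so the extracted JobParameters
            // differ and each launch of reportingSlaveJob gets a new JobInstance
            context.putString("msbfBatchId", "batch-" + i);
            partitions.put("partition" + i, context);
        }
        return partitions;
    }
}

With a per-partition value in the step execution context, CustomJobParametersExtractor produces different JobParameters for every launch of reportingSlaveJob, so each launch gets its own JobInstance.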
