
How to run multiple jobs in spring batch using annotations

I am using Spring Boot + Spring Batch (annotations) and have come across a scenario where I have to run 2 jobs.

I have Employee and Salary records which need to be updated using Spring Batch. I have configured two BatchConfiguration classes by following the Spring Batch getting-started tutorial, one for the Employee object and one for the Salary object, named BatchConfigurationEmployee and BatchConfigurationSalary respectively.

I have defined the ItemReader, ItemProcessor, ItemWriter, and Job as per the tutorial mentioned above.

When I start my Spring Boot application, I want both BatchConfiguration classes to run. How can I achieve this?

********* BatchConfigurationEmployee.java *************

@Configuration
@EnableBatchProcessing
public class BatchConfigurationEmployee {
    public ItemReader<Employee> reader() {
        return new EmployeeItemReader();
    }

    @Bean
    public ItemProcessor<Employee, Employee> processor() {
        return new EmployeeItemProcessor();
    }

    @Bean   
    public Job Employee(JobBuilderFactory jobs, Step s1) {
        return jobs.get("Employee")
                .incrementer(new RunIdIncrementer())
                .flow(s1)
                .end()
                .build();
    }

    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory, ItemReader<Employee> reader,
                    ItemProcessor<Employee, Employee> processor) {
        return stepBuilderFactory.get("step1")
                .<Employee, Employee> chunk(1)
                .reader(reader)
                .processor(processor)
                .build();
    }
}

The Salary class is here:

@Configuration
@EnableBatchProcessing
public class BatchConfigurationSalary {
    public ItemReader<Salary> reader() {
        return new SalaryItemReader();
    }

    @Bean
    public ItemProcessor<Salary, Salary> processor() {
        return new SalaryItemProcessor();
    }

    @Bean
    public Job salary(JobBuilderFactory jobs, Step s1) {
        return jobs.get("Salary")
                .incrementer(new RunIdIncrementer())
                .flow(s1)
                .end()
                .build();
    }

    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory, ItemReader<Salary> reader,
                    ItemProcessor<Salary, Salary> processor) {
        return stepBuilderFactory.get("step1")
                .<Salary, Salary> chunk(1)
                .reader(reader)
                .processor(processor)
                .build();
    }
}

The names of the beans have to be unique in the whole Spring context.

In both jobs, you are instantiating the reader, writer and processor with the same method names. The method name is the name that is used to identify the bean in the context.

In both job definitions, you have reader(), writer() and processor(). They will override each other. Give them unique names like readerEmployee(), readerSalary() and so on.

That should solve your problem.
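A minimal sketch of what the renaming could look like for the Employee configuration (the method names below are illustrative, not from the original post; the step mirrors the original configuration):

```java
@Configuration
@EnableBatchProcessing
public class BatchConfigurationEmployee {

    // The method name doubles as the bean name, so it must not clash
    // with the reader/processor methods in BatchConfigurationSalary.
    @Bean
    public ItemReader<Employee> employeeReader() {
        return new EmployeeItemReader();
    }

    @Bean
    public ItemProcessor<Employee, Employee> employeeProcessor() {
        return new EmployeeItemProcessor();
    }

    // The step bean also needs a unique name; the original post
    // defined step1() in both configuration classes.
    @Bean
    public Step employeeStep(StepBuilderFactory stepBuilderFactory) {
        return stepBuilderFactory.get("employeeStep")
                .<Employee, Employee>chunk(1)
                .reader(employeeReader())
                .processor(employeeProcessor())
                .build();
    }
}
```

The Salary configuration would follow the same pattern with salaryReader(), salaryProcessor() and salaryStep().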

Your jobs are not annotated with @Bean, so the spring-context doesn't know about them.

Have a look at the JobLauncherCommandLineRunner class. All beans in the Spring context that implement the Job interface will be injected into it, and all jobs that are found will be executed. (This happens in the executeLocalJobs method of JobLauncherCommandLineRunner.)

If, for some reason, you don't want to have them as beans in the context, then you have to register your jobs with the JobRegistry. (The executeRegisteredJobs method of JobLauncherCommandLineRunner will take care of launching the registered jobs.)
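If you go the registry route, one common approach (sketched here, assuming Spring Batch's standard JobRegistryBeanPostProcessor) is to declare a post-processor bean that registers every Job bean with the JobRegistry as the context starts up:

```java
// Registers all Job beans in the context with the JobRegistry,
// so they can later be looked up and launched by name.
@Bean
public JobRegistryBeanPostProcessor jobRegistryBeanPostProcessor(JobRegistry jobRegistry) {
    JobRegistryBeanPostProcessor postProcessor = new JobRegistryBeanPostProcessor();
    postProcessor.setJobRegistry(jobRegistry);
    return postProcessor;
}
```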

By the way, you can control which jobs should be started with the property:

spring.batch.job.names= # Comma-separated list of job names to execute on startup (for instance
 `job1,job2`). By default, all Jobs found in the context are executed.

I think this is also a good way to run multiple jobs.

I am using a JobLauncher to configure and execute the jobs, with independent CommandLineRunner implementations to run them, ordered to make sure they are executed in the required sequence.

Apologies for the long post, but I wanted to give a clear picture of what can be achieved with a JobLauncher configuration and multiple command-line runners.

This is my current BeanConfiguration:

@Configuration
public class BeanConfiguration {

    @Autowired
    DataSource dataSource;

    @Autowired
    PlatformTransactionManager transactionManager;

    @Bean(name="jobOperator")
    public JobOperator jobOperator(JobExplorer jobExplorer, JobRegistry jobRegistry) throws Exception {
        SimpleJobOperator jobOperator = new SimpleJobOperator();

        jobOperator.setJobExplorer(jobExplorer);
        jobOperator.setJobRepository(createJobRepository());
        jobOperator.setJobRegistry(jobRegistry);
        jobOperator.setJobLauncher(jobLauncher());

        return jobOperator;
    }

    /**
     * Configure the JobLauncher to execute jobs asynchronously
     * using the ThreadPoolTaskExecutor
     * @return
     * @throws Exception
     */
    @Bean
    public JobLauncher jobLauncher() throws Exception {
            SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
            jobLauncher.setJobRepository(createJobRepository());
            jobLauncher.setTaskExecutor(taskExecutor());
            jobLauncher.afterPropertiesSet();
            return jobLauncher;
    }

    // Read the datasource and set in the job repo
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setIsolationLevelForCreate("ISOLATION_SERIALIZABLE");
        //factory.setTablePrefix("BATCH_");
        factory.setMaxVarCharLength(10000);
        return factory.getObject();
    }

    @Bean
    public RestTemplateBuilder restTemplateBuilder() {
     return new RestTemplateBuilder().additionalInterceptors(new CustomRestTemplateLoggerInterceptor());
    }

    @Bean(name=AppConstants.JOB_DECIDER_BEAN_NAME_EMAIL_INIT)
    public JobExecutionDecider jobDecider() {
        return new EmailInitJobExecutionDecider();
    }

    @Bean
    public ThreadPoolTaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setCorePoolSize(15);
        taskExecutor.setMaxPoolSize(20);
        taskExecutor.setQueueCapacity(30);
        return taskExecutor;
    }
}

I have set up the database to hold the job execution details in Postgres, so the DatasourceConfiguration looks like this (two different beans for two different profiles/environments):

@Configuration
public class DatasourceConfiguration implements EnvironmentAware {

private Environment env;

@Bean
@Qualifier(AppConstants.DB_BEAN)
@Profile("dev")
public DataSource getDataSource() {
    HikariDataSource ds = new HikariDataSource();

    boolean isAutoCommitEnabled = env.getProperty("spring.datasource.hikari.auto-commit") != null ? Boolean.parseBoolean(env.getProperty("spring.datasource.hikari.auto-commit")):false;
    ds.setAutoCommit(isAutoCommitEnabled);
    // Connection test query is for legacy connections
    //ds.setConnectionInitSql(env.getProperty("spring.datasource.hikari.connection-test-query"));
    ds.setPoolName(env.getProperty("spring.datasource.hikari.pool-name"));
    ds.setDriverClassName(env.getProperty("spring.datasource.driver-class-name"));
    long timeout = env.getProperty("spring.datasource.hikari.idleTimeout") != null ? Long.parseLong(env.getProperty("spring.datasource.hikari.idleTimeout")): 40000;
    ds.setIdleTimeout(timeout);
    long maxLifeTime = env.getProperty("spring.datasource.hikari.maxLifetime") != null ? Long.parseLong(env.getProperty("spring.datasource.hikari.maxLifetime")): 1800000 ;
    ds.setMaxLifetime(maxLifeTime);
    ds.setJdbcUrl(env.getProperty("spring.datasource.url"));
    ds.setPoolName(env.getProperty("spring.datasource.hikari.pool-name"));
    ds.setUsername(env.getProperty("spring.datasource.username"));
    ds.setPassword(env.getProperty("spring.datasource.password"));
    int poolSize = env.getProperty("spring.datasource.hikari.maximum-pool-size") != null ? Integer.parseInt(env.getProperty("spring.datasource.hikari.maximum-pool-size")): 10;
    ds.setMaximumPoolSize(poolSize);

    return ds;
}

@Bean
@Qualifier(AppConstants.DB_PROD_BEAN)
@Profile("prod")

public DataSource getProdDatabase() {
    HikariDataSource ds = new HikariDataSource();

    boolean isAutoCommitEnabled = env.getProperty("spring.datasource.hikari.auto-commit") != null ? Boolean.parseBoolean(env.getProperty("spring.datasource.hikari.auto-commit")):false;
    ds.setAutoCommit(isAutoCommitEnabled);
    // Connection test query is for legacy connections
    //ds.setConnectionInitSql(env.getProperty("spring.datasource.hikari.connection-test-query"));
    ds.setPoolName(env.getProperty("spring.datasource.hikari.pool-name"));
    ds.setDriverClassName(env.getProperty("spring.datasource.driver-class-name"));
    long timeout = env.getProperty("spring.datasource.hikari.idleTimeout") != null ? Long.parseLong(env.getProperty("spring.datasource.hikari.idleTimeout")): 40000;
    ds.setIdleTimeout(timeout);
    long maxLifeTime = env.getProperty("spring.datasource.hikari.maxLifetime") != null ? Long.parseLong(env.getProperty("spring.datasource.hikari.maxLifetime")): 1800000 ;
    ds.setMaxLifetime(maxLifeTime);
    ds.setJdbcUrl(env.getProperty("spring.datasource.url"));
    ds.setPoolName(env.getProperty("spring.datasource.hikari.pool-name"));
    ds.setUsername(env.getProperty("spring.datasource.username"));
    ds.setPassword(env.getProperty("spring.datasource.password"));
    int poolSize = env.getProperty("spring.datasource.hikari.maximum-pool-size") != null ? Integer.parseInt(env.getProperty("spring.datasource.hikari.maximum-pool-size")): 10;
    ds.setMaximumPoolSize(poolSize);

    return ds;
}

public void setEnvironment(Environment environment) {
    // TODO Auto-generated method stub
    this.env = environment;
}

}

Make sure the initial application launcher catches the application execution, which returns after job execution terminates (failed or completed), so that you can shut down the JVM gracefully. Otherwise, using the JobLauncher keeps the JVM alive even after all jobs are completed:

@SpringBootApplication
@ComponentScan(basePackages="com.XXXX.Feedback_File_Processing.*")
@EnableBatchProcessing
public class FeedbackFileProcessingApp 
{
    public static void main(String[] args) throws Exception {
        ApplicationContext appContext = SpringApplication.run(FeedbackFileProcessingApp.class, args);
        // The batch job has finished by this point because the 
        //   ApplicationContext is not 'ready' until the job is finished
        // Also, use System.exit to force the Java process to finish with the exit code returned from the Spring App
        System.exit(SpringApplication.exit(appContext));
    }

}

... and so on: you can configure your own decider, and your own jobs/steps as described above in two different configurations, as shown below, and use them separately in the command-line runners (since the post is getting big, I am only giving the details of the jobs and the command-line runners).

Here are the two jobs:

@Configuration
public class DefferalJobConfiguration {

    @Autowired
    JobLauncher joblauncher;

    @Autowired
    private JobBuilderFactory jobFactory;

    @Autowired
    private StepBuilderFactory stepFactory;

    @Bean
    @StepScope
    public Tasklet newSampleTasklet() {
        return ((stepExecution, chunkContext) -> {
            System.out.println("execution of step after flow");
            return RepeatStatus.FINISHED;
        });
    }

    @Bean
    public Step sampleStep() {
        return stepFactory.get("sampleStep").listener(new CustomStepExecutionListener())
                .tasklet(newSampleTasklet()).build();
    }

    @Autowired
    @Qualifier(AppConstants.FLOW_BEAN_NAME_EMAIL_INITIATION)
    private Flow emailInitFlow;

    @Autowired
    @Qualifier(AppConstants.JOB_DECIDER_BEAN_NAME_EMAIL_INIT)
    private JobExecutionDecider jobDecider;

    @Autowired
    @Qualifier(AppConstants.STEP_BEAN_NAME_ITEMREADER_FETCH_DEFERRAL_CONFIG)
    private Step deferralConfigStep;

    @Bean(name=AppConstants.JOB_BEAN_NAME_DEFERRAL)
    public Job deferralJob() {
        return jobFactory.get(AppConstants.JOB_NAME_DEFERRAL)
                .start(emailInitFlow)
                .on("COMPLETED").to(sampleStep())
                .next(jobDecider).on("COMPLETED").to(deferralConfigStep)
                .on("FAILED").fail()
                .end().build();


    }
}



@Configuration
public class TestFlowJobConfiguration {

    @Autowired
    private JobBuilderFactory jobFactory;

    @Autowired
    @Qualifier("testFlow")
    private Flow testFlow;

    @Bean(name = "testFlowJob")
    public Job testFlowJob() {

        return jobFactory.get("testFlowJob").start(testFlow).end().build();
    }
}

Below are the command-line runners (I am making sure the first job is completed before the second one is initialized, but it is entirely up to the user to execute them in parallel following a different strategy):

@Component
@Order(1)
public class DeferralCommandLineRunner implements CommandLineRunner, EnvironmentAware{
    // If the jobLauncher is not used, then by default jobs are launched using SimpleJobLauncher
    //  with default configuration(assumption)
    // hence modified the jobLauncher with vales set in BeanConfig
    // of spring batch
    private Environment env;

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    @Qualifier(AppConstants.JOB_BEAN_NAME_DEFERRAL)
    Job deferralJob;

    @Override
    public void run(String... args) throws Exception {
        // TODO Auto-generated method stub
        JobParameters jobparams = new JobParametersBuilder()
                .addString("run.time", LocalDateTime.now().
                        format(DateTimeFormatter.ofPattern(AppConstants.JOB_DATE_FORMATTER_PATTERN)).toString())
                .addString("instance.name", 
                        (deferralJob.getName() != null) ?deferralJob.getName()+'-'+UUID.randomUUID().toString() :
                            UUID.randomUUID().toString())
                .toJobParameters();
        jobLauncher.run(deferralJob, jobparams);
    }

    @Override
    public void setEnvironment(Environment environment) {
        // TODO Auto-generated method stub
        this.env = environment;
    }

}



@Component
@Order(2)
public class TestJobCommandLineRunner implements CommandLineRunner {

    @Autowired
    JobLauncher jobLauncher;

    @Autowired
    @Qualifier("testFlowJob")
    Job testjob;

    @Autowired
    @Qualifier("jobOperator")
    JobOperator operator;

    @Override
    public void run(String... args) throws Exception {
        // TODO Auto-generated method stub
        JobParameters jobParam = new JobParametersBuilder().addString("name", UUID.randomUUID().toString())
                .toJobParameters();
        System.out.println(operator.getJobNames());
        try {
            Set<Long> deferralExecutionIds = operator.getRunningExecutions(AppConstants.JOB_NAME_DEFERRAL);
            System.out.println("deferralExecutionIds:" + deferralExecutionIds);

            operator.stop(deferralExecutionIds.iterator().next());

        } catch (NoSuchJobException | NoSuchJobExecutionException | JobExecutionNotRunningException e) {
            // just add a logging here
            System.out.println("exception caught:" + e.getMessage());
        }
        jobLauncher.run(testjob, jobParam);
    }

}

Hope this gives a complete picture of how it can be done. I am using spring-boot-starter-batch:jar:2.0.0.RELEASE.
