
Spring Batch execution: how to stop a FlatFileItemWriter from logging an exception when the file name job parameter is empty

I am using a Spring Batch application with a reader, writer, and processor. The file name is passed from the batch job to the writer, which is in step scope. When the bean is initialized, I can see the following exception in the BATCH_STEP_EXECUTION table:

```
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'scopedTarget.resWriter' defined in class path resource: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.batch.item.file.FlatFileItemWriter]: Factory method 'resWriter' threw exception; nested exception is java.lang.IllegalArgumentException: Path must not be null
```

Spring Batch code:

```java
@StepScope
@Bean
public FlatFileItemWriter<EntityObject> regulatedEntityWriter(@Value("#{jobParameters['fileName']}") String fileName) {

    /*
       While the bean is being initialized, fileName is empty, but FlatFileItemWriter
       requires a file name, so it throws the "Path must not be null" exception.
    */
    pretaFileName = fileName;
    FlatFileItemWriter<EntityObject> csvFileWriter = new FlatFileItemWriter<>();

    String exportFileHeader = "column1,column2,column3";
    StringHeaderWriter headerWriter = new StringHeaderWriter(exportFileHeader);
    csvFileWriter.setHeaderCallback(headerWriter);
    csvFileWriter.setShouldDeleteIfEmpty(true);

    CustomDelimitedLineAggregator<EntityObject> lineAggregator = new CustomDelimitedLineAggregator<>();
    BeanWrapperFieldExtractor<EntityObject> fieldExtractor = new BeanWrapperFieldExtractor<>();
    fieldExtractor.setNames(new String[]{"column1", "column2", "column3"});
    lineAggregator.setFieldExtractor(fieldExtractor);
    csvFileWriter.setLineAggregator(lineAggregator);
    csvFileWriter.setEncoding(encodingType);
    csvFileWriter.setResource(new FileSystemResource(fileName));
    return csvFileWriter;
}
```

The method above is invoked through the JobLauncher:

```java
JobParameters params = new JobParametersBuilder()
        .addString("JobID", String.valueOf(System.currentTimeMillis()))
        .addString("fileName", "sample_file.txt")
        .toJobParameters();

JobExecution jobExecution = jobLauncher.run(job, params);
```

I have tried the @Lazy annotation, but the exception is still thrown while the server is coming up. I am running a multi-node cluster, and an entry is added to the BATCH_STEP_EXECUTION table for each node as the servers start. How can I avoid this exception on first server startup?
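One possible workaround (a sketch only, not verified against this setup) is to fall back to a harmless placeholder path whenever the `fileName` job parameter has not been supplied yet, so that eager instantiation at startup never passes a null path to `FlatFileItemWriter`. The helper class and the `placeholder.csv` default below are hypothetical names:

```java
public class FileNameGuard {
    // Hypothetical helper: returns a placeholder path when the 'fileName'
    // job parameter is missing or blank (e.g. during eager initialization
    // at startup). Combined with setShouldDeleteIfEmpty(true) on the
    // writer, the placeholder file is deleted again if nothing is written.
    static String resolveFileName(String fromJobParameters) {
        return (fromJobParameters == null || fromJobParameters.trim().isEmpty())
                ? "placeholder.csv"
                : fromJobParameters;
    }

    public static void main(String[] args) {
        System.out.println(resolveFileName(null));              // placeholder.csv
        System.out.println(resolveFileName("sample_file.txt")); // sample_file.txt
    }
}
```

The same fallback can also be expressed directly in the `@Value` SpEL expression using the Elvis operator, e.g. `@Value("#{jobParameters['fileName'] ?: 'placeholder.csv'}")`, which avoids the extra helper entirely.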

I have set the property below in the Spring Boot application.properties to disable running batch jobs at startup, since I am triggering the job and passing its parameters through a cron trigger.

```
spring.batch.job.enabled=false
```
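Since the job only runs from the cron trigger, each launch has to carry parameters that make the JobInstance unique. A minimal plain-Java sketch of how the launcher code above assembles them (the millisecond `JobID` is what keeps repeated runs distinct; the class name is hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class LaunchParams {
    // Mirrors the JobParametersBuilder usage in the launcher snippet:
    // a millisecond JobID makes every run a new JobInstance, and the
    // fileName is only known here, at trigger time, not at startup.
    static Map<String, String> buildParams(long nowMillis, String fileName) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("JobID", String.valueOf(nowMillis));
        params.put("fileName", fileName);
        return params;
    }

    public static void main(String[] args) {
        System.out.println(buildParams(System.currentTimeMillis(), "sample_file.txt"));
    }
}
```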
