
Spring Batch - Read a byte stream, process, write to 2 different CSV files, convert them to InputStreams, store them to ECS, and then write to a database

I have a requirement where we receive a CSV file as a byte stream through an ECS S3 pre-signed URL. I have to validate the data, write the validation-successful and validation-failed records to 2 different CSV files, and store them in an ECS S3 bucket by converting them to InputStreams. I also have to write the successful records to a database, along with the pre-signed URLs of the inbound, success and failure files.

I'm new to Spring Batch. How should I approach this requirement?

If I choose a FlatFileItemReader to read and an ItemProcessor to process the data, how should I write to the different files and to the database?

or

Should I create a job using Tasklets? TIA.

Please find a sample code snippet below. Let me know if you face any issues.

    // Your input/output DTO - this is the key object
    public class BaseCSVDTO {
        // your CSV-mapped fields
        private SuccessCSVObject successObject;
        private FailureCSVObject failureObject;
        // plus getters and setters
    }

    // Read the file in a reader as usual; create a custom reader if you need more control
    @Bean
    public ItemReader<BaseCSVDTO> yourFlatFileItemReader() {
        // map the CSV fields onto BaseCSVDTO automatically via Spring Batch
    }
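
For illustration, a minimal sketch of how that reader bean could be filled in with FlatFileItemReaderBuilder (org.springframework.batch.item.file.builder) over a ByteArrayResource. The field names and the downloadFromPreSignedUrl() helper are placeholders for illustration, not part of your actual code:

    // Sketch only - assumes the inbound bytes were already fetched from the pre-signed URL
    @Bean
    public FlatFileItemReader<BaseCSVDTO> inboundCsvReader() {
        byte[] inboundCsvBytes = downloadFromPreSignedUrl(); // hypothetical helper returning the byte stream
        return new FlatFileItemReaderBuilder<BaseCSVDTO>()
                .name("inboundCsvReader")
                .resource(new ByteArrayResource(inboundCsvBytes))
                .linesToSkip(1)                               // skip the header row
                .delimited()
                .names(new String[] {"field1", "field2"})     // replace with your CSV-mapped fields
                .targetType(BaseCSVDTO.class)                 // uses a BeanWrapperFieldSetMapper internally
                .build();
    }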

    @Bean
    public CSVProcessor csvValidationProcessor() {
        return new CSVProcessor();
    }
    
    class CSVProcessor implements ItemProcessor<BaseCSVDTO, BaseCSVDTO> {
        @Override
        public BaseCSVDTO process(BaseCSVDTO eachCSVItem) throws Exception {
            // validate each item and populate the Success or Failure object
            // Example for a successful item:
            SuccessCSVObject successObject = new SuccessCSVObject();
            eachCSVItem.setSuccessObject(successObject);
            // do the same with the Failure object for invalid items
            return eachCSVItem;
        }
    }

    @Bean
    public CompositeItemWriter<BaseCSVDTO> compositeWriter() throws Exception {
        CompositeItemWriter<BaseCSVDTO> compositeItemWriter = new CompositeItemWriter<>();
        List<ItemWriter<? super BaseCSVDTO>> writers = new ArrayList<>();
        writers.add(successCSVWriter());
        writers.add(failureCSVWriter());
        compositeItemWriter.setDelegates(writers);
        return compositeItemWriter;
    }
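
The requirement also includes persisting the successful records to the database. One option is to register a JdbcBatchItemWriter (org.springframework.batch.item.database) as an additional delegate in the composite writer above. The table and column names below are assumptions for illustration; in practice you would route only validated items to it (for example with a ClassifierCompositeItemWriter) so failed records never reach the database:

    // Sketch only - SUCCESS_RECORD and its columns are placeholder names
    @Bean
    public JdbcBatchItemWriter<BaseCSVDTO> successDbWriter(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<BaseCSVDTO>()
                .dataSource(dataSource)
                .sql("INSERT INTO SUCCESS_RECORD (FIELD1, FIELD2) VALUES (:field1, :field2)")
                .beanMapped() // :field1 / :field2 map to the CSV-mapped properties on BaseCSVDTO
                .build();
    }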

    @Bean
    public ItemWriter<BaseCSVDTO> successCSVWriter() {
        return new SuccessWriter();
    }

    @Bean
    public ItemWriter<BaseCSVDTO> failureCSVWriter() {
        return new FailureWriter();
    }

    
    public class SuccessWriter implements ItemWriter<BaseCSVDTO> {
        @Override
        public void write(List<? extends BaseCSVDTO> items) {
            for (BaseCSVDTO baseCSVDTO : items) {
                baseCSVDTO.getSuccessObject();
                // write the success CSV
            }
        }
    }

    public class FailureWriter implements ItemWriter<BaseCSVDTO> {
        @Override
        public void write(List<? extends BaseCSVDTO> items) {
            for (BaseCSVDTO baseCSVDTO : items) {
                baseCSVDTO.getFailureObject();
                // write the failure CSV
            }
        }
    }
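
Inside those writers, the "convert to InputStream and store to ECS" part can be done by accumulating the CSV lines and uploading them through ECS's S3-compatible API. A minimal sketch using an AWS SDK v1 AmazonS3 client configured against the ECS endpoint; the bucket name, object key and csvContent argument are placeholders:

    // Sketch only - amazonS3 is assumed to be an AmazonS3 client pointed at the ECS S3 endpoint
    private void uploadCsv(String csvContent, String key) {
        byte[] bytes = csvContent.getBytes(StandardCharsets.UTF_8);
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentLength(bytes.length);
        metadata.setContentType("text/csv");
        // the CSV is handed over as an InputStream, as the requirement asks
        amazonS3.putObject("your-bucket", key, new ByteArrayInputStream(bytes), metadata);
    }

The pre-signed URLs of the uploaded success and failure files can then be produced with amazonS3.generatePresignedUrl(bucket, key, expiration) and saved to the database together with the successful records.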

    // Finally, the job step
    @Bean
    public Step executionStep() throws Exception {
        return stepBuilderFactory.get("executionStep").<BaseCSVDTO, BaseCSVDTO>chunk(chunkSize)
                .reader(yourFlatFileItemReader()).processor(csvValidationProcessor()).writer(compositeWriter())
                //.faultTolerant()
                //.skipLimit(skipErrorCount).skip(Exception.class)//.noSkip(FileNotFoundException.class)
                //.listener(validationListener())
                //.noRetry(Exception.class)
                //.noRollback(Exception.class)
                .build();
    }
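
To complete the picture, the step can be wired into a job. A minimal sketch, assuming a jobBuilderFactory is injected the same way as stepBuilderFactory; the job name is illustrative:

    @Bean
    public Job csvValidationJob() throws Exception {
        return jobBuilderFactory.get("csvValidationJob")
                .incrementer(new RunIdIncrementer()) // lets the job be re-launched with a new run id
                .start(executionStep())
                .build();
    }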
