
How to bulk insert using Spring-Batch?

I'm using spring-batch and spring-data-jpa to read large CSV data files and persist entries (or update existing ones) to a PostgreSQL DB using Hibernate.

How do I have to configure Spring to make use of batch/bulk inserts?

When I configure the job step, I set the chunk size accordingly:

@Autowired
private StepBuilderFactory stepBuilderFactory;

Step step = stepBuilderFactory.get("csvImportStep")
        .<CsvRecord, MyEntity>chunk(10000)
        .reader(csvReader)
        .writer(jpaItemWriter)
        .build();

Do I further have to be concerned about the hibernate.jdbc.batch_size property? Do I have to also set it, maybe to the same size as the chunk size?
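For context, hibernate.jdbc.batch_size is usually set via the JPA properties. A minimal sketch, assuming a Spring Boot setup with Hibernate as the JPA provider (property values here are illustrative, not recommendations):

```properties
# Illustrative: enables Hibernate JDBC statement batching
spring.jpa.properties.hibernate.jdbc.batch_size=10000
# Often set alongside batching so Hibernate can group similar statements
spring.jpa.properties.hibernate.order_inserts=true
spring.jpa.properties.hibernate.order_updates=true
```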

When you set the chunk size, the reader sends data to the writer in chunks of that size. So the chunk size is what governs the actual batch update to the DB in Spring Batch.

I don't think batch_size matters here, since you have already configured the job with a specific chunk size for that step.

You also need to use a bulk update query when writing to the database. Search for "JDBC bulk updates" for details; depending on how you issue the JDBC query, you will need to use the corresponding bulk update mechanism.
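As one example of such a bulk mechanism, Spring Batch ships a JdbcBatchItemWriter that issues one JDBC batch per chunk. A minimal sketch, assuming a DataSource bean and a hypothetical Person item with getName()/getAge() (the table and column names are made up for illustration):

```java
// Sketch only: Person, table and column names are assumptions.
import javax.sql.DataSource;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.batch.item.database.builder.JdbcBatchItemWriterBuilder;

public class WriterConfig {

    public JdbcBatchItemWriter<Person> jdbcWriter(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<Person>()
                .dataSource(dataSource)
                // One parameterized INSERT, executed as a JDBC batch per chunk
                .sql("INSERT INTO person (name, age) VALUES (:name, :age)")
                .beanMapped() // bind :name/:age from Person getters
                .build();
    }
}
```

Compared to JpaItemWriter, this bypasses the Hibernate session entirely, so it avoids the batch_size question at the cost of writing the SQL yourself.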

