Requirement:
Read multiple files using a MultiResourcePartitioner, and use a TaskExecutor to make the processing multithreaded.

Current implementation:
Step s1 = sbf.get("file-db")
    .<Person, Person>chunk(1500)      // commit interval per partition
    .reader(reader())                 // reads one file per partition
    .writer(jdbcWriter())             // writes that file's entries to the DB
    .build();

Step master = sbf.get("master-step")
    .listener(stepExecutionListener())
    .partitioner("master", partitioner())  // one partition per file
    .step(s1)
    .taskExecutor(taskExecutor())          // runs partitions concurrently
    .build();
The problem:
Persisting only the entries from a single file to the database in each thread is inefficient. Is it possible to pool the entities in some data sink BEFORE committing to the DB using built-in Spring Batch functionality? Or is the only way to achieve this to push the entities into a simple Queue and then read from it?
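The queue idea mentioned above can be sketched without any Spring Batch machinery at all, using a plain java.util.concurrent.BlockingQueue. This is a hedged, hypothetical sketch (the class and method names are mine, not from any library): each partition's writer would offer items into the shared sink instead of hitting the DB, and a single drainer thread would flush them in larger batches.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Hypothetical sketch: a shared sink that pools entities coming from all
// partition threads and hands them to a single DB writer in larger batches.
class PooledSink<T> {
    private final BlockingQueue<T> queue;
    private final int batchSize;

    PooledSink(int capacity, int batchSize) {
        this.queue = new ArrayBlockingQueue<>(capacity);
        this.batchSize = batchSize;
    }

    // Called from each partition's writer instead of writing to the DB.
    void offer(T item) throws InterruptedException {
        queue.put(item); // blocks when the pool is full, applying back-pressure
    }

    // Called by a single drainer thread; returns up to batchSize items.
    List<T> drainBatch() {
        List<T> batch = new ArrayList<>(batchSize);
        queue.drainTo(batch, batchSize);
        return batch;
    }

    public static void main(String[] args) throws InterruptedException {
        PooledSink<String> sink = new PooledSink<>(100, 3);
        for (int i = 0; i < 7; i++) {
            sink.offer("entity-" + i);
        }
        // Two full batches of 3, then a final partial batch of 1.
        System.out.println(sink.drainBatch().size()); // 3
        System.out.println(sink.drainBatch().size()); // 3
        System.out.println(sink.drainBatch().size()); // 1
    }
}
```

Note the trade-off: once entities cross this queue they leave the originating step's transaction, so a failure during the pooled write would not roll back the reads that produced those items.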
The "workaround" I found is to implement a custom Partitioner, similar to MultiResourcePartitioner, and push the list of file names into the ExecutionContext.
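The partitioning logic of that workaround could look roughly like this. It is a dependency-free sketch of the splitting step only: a real implementation would implement Spring Batch's Partitioner interface and put each sub-list into an ExecutionContext, for which the inner Map<String, Object> stands in here; the class name, the "fileNames" key, and the round-robin split are all my own assumptions.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: split a list of file names into gridSize partitions,
// storing each partition's sub-list under a key the reader would look up.
// In real Spring Batch code this would be Partitioner#partition and each
// inner Map would be an ExecutionContext.
class FileListPartitioner {

    static Map<String, Map<String, Object>> partition(List<String> files, int gridSize) {
        Map<String, Map<String, Object>> partitions = new HashMap<>();
        for (int i = 0; i < gridSize; i++) {
            // Round-robin assignment: partition i gets files i, i+gridSize, ...
            List<String> slice = new ArrayList<>();
            for (int j = i; j < files.size(); j += gridSize) {
                slice.add(files.get(j));
            }
            Map<String, Object> context = new HashMap<>();
            context.put("fileNames", slice);
            partitions.put("partition" + i, context);
        }
        return partitions;
    }

    public static void main(String[] args) {
        List<String> files = List.of("a.csv", "b.csv", "c.csv", "d.csv", "e.csv");
        Map<String, Map<String, Object>> result = partition(files, 2);
        System.out.println(result.get("partition0").get("fileNames")); // [a.csv, c.csv, e.csv]
        System.out.println(result.get("partition1").get("fileNames")); // [b.csv, d.csv]
    }
}
```

Each worker step's reader would then pull its "fileNames" list from the step's ExecutionContext and read several files before the chunk commits, instead of one file per thread.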