I have the following issue: I have a file where each line results in 1 or more items being written to a database. I need to do a lookup from another system to find out how many items need to be written. Each of the items resulting from a single line must be transformed using a chain of item processors and finally written to multiple tables in a DB.
Because each item needs to write to multiple tables, they must each be in their own transaction. Because of this, I can't just have an ItemProcessor<Foo, List<Bar>> which handles it. In that case -- even with a commit-interval of 1 -- I would end up with multiple items in the same transaction.
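For reference, the shape being ruled out looks roughly like this (a sketch only: `Foo` and `Bar` stand in for the real line and item types, and the `ItemProcessor` interface is stubbed locally so the snippet runs without Spring on the classpath):

```java
import java.util.Arrays;
import java.util.List;

public class ListProcessorSketch {

    // Local stand-in for org.springframework.batch.item.ItemProcessor<I, O>
    interface ItemProcessor<I, O> {
        O process(I item);
    }

    static class Foo { final String line; Foo(String line) { this.line = line; } }
    static class Bar { final String value; Bar(String value) { this.value = value; } }

    // One input line fans out to several items -- but the framework treats the
    // whole List<Bar> as a single item in the chunk, so even with a
    // commit-interval of 1, every Bar from the same line lands in the same
    // transaction.
    static class FooToBarsProcessor implements ItemProcessor<Foo, List<Bar>> {
        @Override
        public List<Bar> process(Foo foo) {
            return Arrays.asList(new Bar(foo.line + "-1"), new Bar(foo.line + "-2"));
        }
    }

    public static void main(String[] args) {
        List<Bar> bars = new FooToBarsProcessor().process(new Foo("line"));
        System.out.println(bars.size()); // both items would share one transaction
    }
}
```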
I have seen this Stack Overflow question already. The accepted answer doesn't help me because of the transaction issue. The other answer, about using a Spring Integration splitter, sounds intriguing, but it doesn't give many details. How do I define the reader's output as the channel input? How would I define an output channel which goes to my item writer? How could I still run an item processor chain on each newly divided item?
I haven't been able to find any examples using the splitter within a spring batch job. Any recommendations would be appreciated.
I had the same problem with FlatFileItemReader, minus the transaction issue. I dealt with it this way:
public class FlatFileItemListWriter<T> extends FlatFileItemWriter<T> {

    /**
     * Flattens the incoming lists into a single list of items before
     * delegating, so the parent writer sees individual items.
     *
     * @see org.springframework.batch.item.file.FlatFileItemWriter#write(java.util.List)
     */
    @Override
    @SuppressWarnings("unchecked")
    public void write(List<? extends T> itemsLists) throws Exception {
        List<T> items = new ArrayList<T>();
        for (Object item : itemsLists) {
            items.addAll((List<T>) item);
        }
        super.write(items);
    }
}
That way, the FlatFileItemWriter deals with the list the way it would have if my processor didn't produce lists.
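The flattening that the override performs can be exercised in isolation (plain Java, no Spring on the classpath; the `flatten` helper name is made up for illustration and simply mirrors the loop inside the overridden `write`):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class FlattenDemo {

    // Mirrors what FlatFileItemListWriter.write does before delegating to
    // super.write: collapse a chunk of lists into one flat list of items.
    @SuppressWarnings("unchecked")
    static <T> List<T> flatten(List<?> itemsLists) {
        List<T> items = new ArrayList<T>();
        for (Object item : itemsLists) {
            items.addAll((List<T>) item);
        }
        return items;
    }

    public static void main(String[] args) {
        List<List<String>> chunk = Arrays.asList(
                Arrays.asList("a", "b"),
                Arrays.asList("c"));
        System.out.println(flatten(chunk)); // [a, b, c]
    }
}
```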