
How to write child records using an ItemWriter in Spring Batch

My situation:

I have a class A which I read from the database in an ItemReader. I then process each A and create a class B instance, which I do in an ItemProcessor. Finally, I save B to the database in an ItemWriter.

Problem: during processing I also need to create class C records (about 1 million of them), which hold a foreign key to class B, and save those C records as well. How should I do this?

I can't do something like the code below, because, as I wrote, there are about 1 million C records, and keeping them all in memory would take about 2 GB. So how should I solve this problem?

public class BWriter extends BaseItemWriter<B> {

    @Override
    public void write(List<? extends B> data) throws Exception {
        logger.info("Start writing: " + data);
        for (B item : data) {
            myCustomDao.saveB(item);
            // Every C for this B is already held in memory on the B instance,
            // which is what blows up the heap for ~1 million C records.
            for (C itemC : item.getC()) {
                itemC.setB(item);
                myCustomDao.saveC(itemC);
            }
        }
    }
}

UPDATE:

A possible solution that does not use Spring Batch, which is what I actually want to use:

    List<C> cList = new ArrayList<C>();
    int i = 0;
    String line;
    while ((line = reader.readLine()) != null) {
        String[] data = line.split(";");
        if (data.length > 1 && !StringUtils.isBlank(data[1])) {
            C cItem = new C();
            cItem.set(...);
            cList.add(cItem);
            if (++i >= 1000) {
                myCustomDao.save(cList); // flush a batch of 1000 C records
                cList = new ArrayList<C>();
                i = 0;
            }
        }
    }
    if (!cList.isEmpty()) {
        myCustomDao.save(cList);         // save the final partial batch
    }

If reducing the commit-interval to a small value is not an option because one B element can have up to 1 million C objects, you can do the following:

Process class A into class B without creating the C objects on the processed B object;
then attach an ItemWriteListener<B> to your BWriter and, in afterWrite(), create and save the C objects (those related to the List<B> received by the listener) one by one. This keeps memory consumption low, and you are still guaranteed to work within the transaction boundaries.
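The batching logic that would live inside such a listener can be sketched in plain Java. This is a minimal, hedged sketch: `MyCustomDao` and `saveChildren` are hypothetical stand-ins (not real Spring Batch or project APIs), and in a real job this logic would sit in `ItemWriteListener<B>.afterWrite(List<B>)`. The point is to stream the C records and flush them in small batches so only one batch is in memory at a time:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Sketch of the listener idea: after a chunk of B items is written,
// generate and persist their C children in small batches instead of
// holding them all on the B object.
class CBatchSaver {

    interface MyCustomDao {            // hypothetical DAO stand-in
        void saveC(List<?> batch);
    }

    static final int BATCH_SIZE = 1000;

    private final MyCustomDao dao;

    CBatchSaver(MyCustomDao dao) {
        this.dao = dao;
    }

    // Stream the C records for one B and flush every BATCH_SIZE items,
    // so at most BATCH_SIZE C objects are in memory at any moment.
    int saveChildren(Iterator<Object> cSource) {
        List<Object> batch = new ArrayList<>(BATCH_SIZE);
        int saved = 0;
        while (cSource.hasNext()) {
            batch.add(cSource.next());
            if (batch.size() >= BATCH_SIZE) {
                dao.saveC(batch);
                saved += batch.size();
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {        // final partial batch
            dao.saveC(batch);
            saved += batch.size();
        }
        return saved;
    }
}
```

The key design choice is that the C objects are produced lazily (here via an `Iterator`) rather than materialized as a list on B, so the 2 GB problem never arises.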

If the problem is caused by using Hibernate rather than plain JDBC, consider a StatelessSession, or call flush()/clear() on the session manually; 1 million records is not a big number for a database.
Unfortunately, ORM is not the best choice when you have a large amount of data.
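The flush()/clear() pattern can be illustrated without a database. This is a self-contained sketch, not real Hibernate usage: `FakeSession` is a hypothetical stand-in that mimics a Session's first-level cache, showing why periodic flush-and-clear keeps memory bounded during bulk inserts:

```java
import java.util.ArrayList;
import java.util.List;

// Demonstrates the flush()/clear() batching pattern: flushing pushes
// pending inserts out, clearing detaches entities so the persistence
// context (first-level cache) never grows past one batch.
class FlushClearDemo {

    static class FakeSession {         // hypothetical Session stand-in
        final List<Object> firstLevelCache = new ArrayList<>();
        int flushedRows = 0;
        int peakCacheSize = 0;

        void persist(Object entity) {
            firstLevelCache.add(entity);
            peakCacheSize = Math.max(peakCacheSize, firstLevelCache.size());
        }

        void flush() {                 // "write" pending entities to the db
            flushedRows += firstLevelCache.size();
        }

        void clear() {                 // detach entities, freeing memory
            firstLevelCache.clear();
        }
    }

    // Insert `total` entities, flushing and clearing every `batchSize`,
    // so the session never holds more than batchSize entities at once.
    static FakeSession insertInBatches(int total, int batchSize) {
        FakeSession session = new FakeSession();
        for (int i = 0; i < total; i++) {
            session.persist(new Object());
            if ((i + 1) % batchSize == 0) {
                session.flush();
                session.clear();
            }
        }
        session.flush();               // flush the final partial batch
        session.clear();
        return session;
    }
}
```

With real Hibernate the calls would be `session.flush(); session.clear();` on the actual Session inside the same loop, ideally with `hibernate.jdbc.batch_size` set to match the batch size.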

My two cents; I'm quite new to Spring Batch, just like you.
