
How to make batch insert in Oracle Database

I need to do batch inserts into an Oracle DB, but I am confused about how to assemble the batch.

String INSERT = "INSERT INTO LOGS(METHOD,USER,START_DATE,RESPONSE_TIME,IS_ERROR) VALUES (?,?,?,?,?)";

private synchronized void saveToDBAndClear(ConcurrentHashMap<Long, Logs> logs) {
    List<Logs> list = new ArrayList<>(logs.values());
    logService.insertLog(list);
    initLogMap();
}


    public void insertLog(List<Logs> logsList) {

        int[] insertedLog = jdbcTemplate.batchUpdate(INSERT, new BatchPreparedStatementSetter() {

            @Override
            public void setValues(PreparedStatement ps, int i) throws SQLException {
                ps.setString(1, logsList.get(i).getMethod());
                ps.setString(2, logsList.get(i).getUser());
                ps.setTimestamp(3, logsList.get(i).getStartDate());
                ps.setLong(4, logsList.get(i).getResponseTime());
                ps.setString(5, logsList.get(i).getIsError());
            }

            @Override
            public int getBatchSize() {
                return logsList.size();
            }
        });
        logger.info("It was inserted {} logs into Logs", insertedLog.length);
    }

I have to do this

List<Logs> list = new ArrayList<>(logs.values());

because I don't know how to apply jdbcTemplate.batchUpdate() to a ConcurrentHashMap and collect its entries into batches of 100 or 1000 rows before sending them to the DB.

Can anyone help me?
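If I understand the JdbcTemplate API correctly, the batchUpdate overload that takes a Collection plus a batch size should already split the data into sub-batches for me, so a rough sketch of what I am aiming for (reusing INSERT, jdbcTemplate and the Logs getters from above) would be:

public void insertLog(Collection<Logs> logs) {
    // batchUpdate splits `logs` into sub-batches of at most 1000 rows
    // and sends each sub-batch to Oracle as one JDBC batch
    int[][] updateCounts = jdbcTemplate.batchUpdate(INSERT, logs, 1000, (ps, log) -> {
        ps.setString(1, log.getMethod());
        ps.setString(2, log.getUser());
        ps.setTimestamp(3, log.getStartDate());
        ps.setLong(4, log.getResponseTime());
        ps.setString(5, log.getIsError());
    });
    logger.info("Sent {} sub-batches to the DB", updateCounts.length);
}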

PS

Here is what I tried:

public class LogProcessor {

    private int batchSize = 10;
    private final double initialCapacity = 1.26;
    private ConcurrentHashMap<Long, Logs> logMap;
    private AtomicLong logMapSize;

    private void initLogMap() {
        this.logMap = new ConcurrentHashMap<>((int) (batchSize * initialCapacity));
        this.logMapSize = new AtomicLong();
    }

    public void process(LogKeeper keeper) {
        LogHandler log = keeper.getLog();
        Long i = logMapSize.incrementAndGet();
        logMap.put(i, log.toJdbc());
        System.out.println("Lines inside the map = " + logMap.size());
        if (i % batchSize == 0) {
            System.out.println("Reached batchSize and = " + batchSize);
            saveToDBAndClear(logMap);
        }
    }

    private synchronized void saveToDBAndClear(ConcurrentHashMap<Long, Logs> logs) {
        List<Logs> list = new ArrayList<>(logs.values());
        System.out.println("Created list with size = " + list.size());
        logService.insertLog(list);
        initLogMap();
        System.out.println("Now size of map = " + logs.size() + " and AtomicLong = " + logMapSize.intValue());
    }

    @TransactionalRollback
    public void insertLog(List<Logs> logsList) {
        System.out.println("Inside insertLog method");
        int[][] insertedLog = jdbcTemplate.batchUpdate(INSERT, logsList, 15, (ps, arg) -> {
            ps.setString(1, arg.getMethod());
            ps.setString(2, arg.getClient());
            ps.setTimestamp(3, arg.getStartDate());
            ps.setLong(4, arg.getResponseTime());
            ps.setString(5, arg.getIsError());
        });
        System.out.println("It was inserted " + insertedLog[0].length + " logs into DB");
    }
}

There is also some log output.

Now, as you can see, my batchSize private field is 10, while in batchUpdate I pass 15. I thought that if I sent the insertLog method a list of, say, size 1 or 100, it would collect the rows into batches of size 15 and send those to the DB, but it only inserts however many rows the list actually contains.
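From what I can tell, the int[][] that batchUpdate returns holds one inner array per sub-batch (so 7 of them for a list of 100 with a batch size of 15) and one element per row in that sub-batch, which is why insertedLog[0].length only reports the first chunk. Counting every row would look roughly like this (the Oracle driver may report Statement.SUCCESS_NO_INFO per row, so I count array lengths rather than the values):

// total rows sent = sum of the sizes of all sub-batches
int totalRows = Arrays.stream(insertedLog)
        .mapToInt(subBatch -> subBatch.length)
        .sum();
System.out.println("It was inserted " + totalRows + " logs into DB");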

So I have to collect exactly the batch size I need in the Map and only then send it to the insertLog method. If it were possible, I would rather just pass a single Logs entry to insertLog and write something like

public void insertLog(Logs logs) {
    jdbcTemplate.batchUpdate(INSERT, logs, 1000, (ps, arg) -> (...));
}

P.S. Can the batchSize parameter take care of that kind of insert for me? Is it possible to do a batch insert/batchUpdate without a simple check like i % batchSize == 0?
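One idea I have for dropping the i % batchSize check entirely (just a sketch, not wired into my real classes) is to buffer the entries in a queue and flush it from a scheduled task, letting the batch-size argument of batchUpdate do the splitting:

private final BlockingQueue<Logs> buffer = new LinkedBlockingQueue<>();
private final ScheduledExecutorService flusher = Executors.newSingleThreadScheduledExecutor();

public LogProcessor() {
    // flush whatever has accumulated once per second; no per-entry counting needed
    flusher.scheduleAtFixedRate(this::flush, 1, 1, TimeUnit.SECONDS);
}

public void process(LogKeeper keeper) {
    buffer.add(keeper.getLog().toJdbc());
}

private void flush() {
    List<Logs> chunk = new ArrayList<>();
    buffer.drainTo(chunk);              // take everything buffered so far
    if (!chunk.isEmpty()) {
        logService.insertLog(chunk);    // batchUpdate(..., 1000, ...) splits it into sub-batches
    }
}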

Take a look at PLSQLSample.java for reference.
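For reference, the plain-JDBC pattern that any of the template approaches ultimately rely on is addBatch()/executeBatch() on a single PreparedStatement. A minimal sketch, assuming a dataSource field (my name, not from your code) and the same LOGS columns as in the question:

void insertLogsPlainJdbc(List<Logs> logsList) throws SQLException {
    try (Connection con = dataSource.getConnection();
         PreparedStatement ps = con.prepareStatement(INSERT)) {
        con.setAutoCommit(false);
        int queued = 0;
        for (Logs log : logsList) {
            ps.setString(1, log.getMethod());
            ps.setString(2, log.getUser());
            ps.setTimestamp(3, log.getStartDate());
            ps.setLong(4, log.getResponseTime());
            ps.setString(5, log.getIsError());
            ps.addBatch();
            if (++queued % 1000 == 0) {
                ps.executeBatch();   // send a full batch of 1000 rows to Oracle
            }
        }
        ps.executeBatch();           // send the remaining rows
        con.commit();
    }
}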

