Is Teradata CLOB batch processing useless with JDBC?

I think I know the answer to this question, but I also want to confirm it with the experts here. I believe the answer is: "Yes, because the batch size is limited to 16, which is far too small. So, in practice, Teradata CLOB batch processing is useless."

Here is my reasoning. The following is working Java code; I copy a table from one database connection to another using a stream:

 public class TestClob {

     public void test() throws ClassNotFoundException, SQLException, IOException {

         Connection conn1, conn2;
         conn1 = DriverManager.getConnection(..., user, pass);
         conn2 = DriverManager.getConnection(..., user, pass);

         Statement select = conn1.createStatement();
         ResultSet rs = select.executeQuery("SELECT TOP 100 myClob FROM myTab");

         int totalRowNumber = 0;

         // prepare once, outside the loop, so statements are not leaked
         PreparedStatement ps = conn2.prepareStatement("INSERT INTO myTab2 (myClob2) VALUES (?)");
         Clob clob = null;
         Reader clobReader = null;

         while (rs.next()) {
             totalRowNumber++;
             System.out.println(totalRowNumber);
             clob = rs.getClob(1);
             clobReader = clob.getCharacterStream();
             ps.setCharacterStream(1, clobReader, clob.length());
             ps.execute();       // HERE I just execute the current row
             clob.free();        // FREE the CLOB object
             clobReader.close(); // CLOSE the Reader object
         }

         conn2.commit();
         ps.close();
         select.close();
         rs.close();
     }
 }

According to Teradata's rules, no more than 16 LOB-related objects may be open at the same time.

This is why I have to make sure that the Clob clob and the Reader clobReader are freed and closed, respectively.

So I have two options:

1) use the executeBatch() method, with at most 16 Clob clob and Reader clobReader objects open at a time (sketched right after this list); or

2) use the execute() method and close the Clob clob and Reader clobReader objects immediately afterwards.
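
For reference, a minimal sketch of how option 1 could look (my own illustration, untested against a real Teradata system; the class and method names are placeholders). It flushes the batch and frees the LOB objects every 16 rows so that the limit is never exceeded:

 import java.io.IOException;
 import java.io.Reader;
 import java.sql.Clob;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.util.ArrayList;
 import java.util.List;

 public class ClobBatchOf16 {

     // Copies CLOB rows from rs into ps in batches of at most 16, so that
     // no more than 16 LOB-related objects are ever open at once.
     public static void copyInBatches(ResultSet rs, PreparedStatement ps)
             throws SQLException, IOException {
         List<Clob> clobs = new ArrayList<Clob>();
         List<Reader> readers = new ArrayList<Reader>();
         while (rs.next()) {
             Clob clob = rs.getClob(1);
             Reader reader = clob.getCharacterStream();
             ps.setCharacterStream(1, reader, clob.length());
             ps.addBatch();
             clobs.add(clob);
             readers.add(reader);
             if (clobs.size() == 16) { // Teradata's open-LOB limit
                 flush(ps, clobs, readers);
             }
         }
         if (!clobs.isEmpty()) {       // flush the remaining rows
             flush(ps, clobs, readers);
         }
     }

     // Executes the pending batch, then frees and closes all LOB objects.
     private static void flush(PreparedStatement ps, List<Clob> clobs,
             List<Reader> readers) throws SQLException, IOException {
         ps.executeBatch();
         for (Clob c : clobs) c.free();
         for (Reader r : readers) r.close();
         clobs.clear();
         readers.clear();
     }
 }

This keeps the batches legal, but it also caps them at 16 rows, which is exactly the limitation in question.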

Conclusion: Teradata CLOB batch inserts are useless with JDBC. The batch size cannot exceed 16 when trying to insert Clobs.

Please help me out and let me know whether my understanding is correct.

I see no other way around this.

You can find below an example of a batch insert of more than 16 Clobs.

 import java.io.BufferedReader;
 import java.io.IOException;
 import java.io.Reader;
 import java.io.StringReader;
 import java.security.GeneralSecurityException;
 import java.sql.Connection;
 import java.sql.DriverManager;
 import java.sql.PreparedStatement;
 import java.sql.ResultSet;
 import java.sql.SQLException;
 import java.sql.Statement;
 import java.util.ArrayList;
 import java.util.List;


 public class ClobBatch {

    public static void main(String[] args) throws GeneralSecurityException, IOException, SQLException {

        String databaseCredentials = ExternalData.getCredentials();
        Connection c1=DriverManager.getConnection(databaseCredentials);
        Connection c2=DriverManager.getConnection(databaseCredentials);

        String sql="create volatile table clob_test_input ( id bigint, longobj clob) no primary index on commit preserve rows;";
        Statement s=c1.createStatement();
        s.execute(sql);

        String sql2="create volatile table clob_test_target ( id bigint, longobj clob) no primary index on commit preserve rows;";
        Statement s2=c2.createStatement();
        s2.execute(sql2);

        System.out.println("Inserting test data");
        PreparedStatement ps=c1.prepareStatement("insert into clob_test_input (id, longobj) values (?,?);"); 
        for(int i=0; i<1000; i++) {
            String st=randomLargeString();
            ps.setInt(1, i);
            ps.setCharacterStream(2, new BufferedReader(new StringReader(st)), st.length());
            ps.addBatch();
        }
        ps.executeBatch();

        System.out.println("reading test data from input table");
        Statement select=c1.createStatement();
        ResultSet rs=select.executeQuery("select * from clob_test_input");


        PreparedStatement ps2=c2.prepareStatement("insert into clob_test_target (id, longobj) values (?,?);"); 
        List<Reader> readerToClose=new ArrayList<Reader>(); 
        System.out.println("start batch creation");
        while(rs.next()) {
            int pos=rs.getInt("id");
            Reader rdr=new BufferedReader(rs.getCharacterStream("longobj"));

            // copy the whole CLOB into an in-memory buffer so that the
            // database-side LOB object can be closed right away
            StringBuffer buffer=new StringBuffer();
            int c=0;
            while((c=rdr.read())!=-1) {
                buffer.append((char)c);
            }
            rdr.close();
            ps2.setInt(1, pos);
            // the parameter is now bound to a plain StringReader, not a LOB
            // locator, so the 16-open-LOB limit does not apply to the batch
            Reader strReader= new StringReader(buffer.toString());
            ps2.setCharacterStream(2, strReader,buffer.length());
            readerToClose.add(strReader);
            ps2.addBatch();
        }
        System.out.println("start batch execution");
        ps2.executeBatch();
        rs.close();
        c1.commit();
        c2.commit();

        for(Reader r:readerToClose) r.close();

        Statement selectTest=c2.createStatement();
        ResultSet rsTest=selectTest.executeQuery("select * from clob_test_target");
        System.out.println("show results");
        int i=0;
        while(rsTest.next()) {
            BufferedReader is=new BufferedReader(rsTest.getCharacterStream("longobj"));
            StringBuilder sb=new StringBuilder();
            int c=0;
            while((c=is.read())!=-1) {
                sb.append((char)c);
            }
            is.close();
            System.out.println(""+rsTest.getInt("id")+' '+sb.toString().substring(0,80));
        }

        rsTest.close();
    }


    private static String randomLargeString() {
        StringBuilder sb=new StringBuilder();
        for(int i=0;i<10000; i++) {
            sb.append((char) (64+Math.random()*20));
        }
        return sb.toString();
    }
 } 

I have made some optimistic assumptions (e.g., Clobs of 10,000 characters), but the memory footprint could be reduced by using temporary files instead of StringBuffers.

The approach is basically to find some "buffer" (in memory or in temporary files) in which to keep the data from the source database, so that the input Clob Reader can be closed. You can then batch-insert the data from that buffer without the limit of 16 (only memory limits remain).
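
As a rough illustration of that temporary-file variant (a sketch only; ClobSpooler and spool are hypothetical names, and encoding and error handling are simplified):

 import java.io.BufferedWriter;
 import java.io.File;
 import java.io.FileWriter;
 import java.io.IOException;
 import java.io.Reader;
 import java.io.Writer;
 import java.sql.Clob;
 import java.sql.SQLException;

 public class ClobSpooler {

     // Copies one source CLOB into a temp file and frees the LOB right away,
     // so the 16-open-LOB limit is never hit no matter how large the batch is.
     public static File spool(Clob clob) throws SQLException, IOException {
         File tmp = File.createTempFile("clob", ".tmp");
         tmp.deleteOnExit();
         try (Reader in = clob.getCharacterStream();
              Writer out = new BufferedWriter(new FileWriter(tmp))) {
             char[] buf = new char[8192];
             int n;
             while ((n = in.read(buf)) != -1) {
                 out.write(buf, 0, n);
             }
         }
         clob.free(); // the LOB object is released here, not at batch time
         return tmp;
     }
 }

At insert time each row would then bind a new BufferedReader(new FileReader(tmpFile)) as the parameter; record clob.length() before calling free() in case your driver requires the length-based setCharacterStream overload.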

