
What is the effective way of inserting huge data with complex data types in Postgres using JDBC

We have a requirement to insert huge numbers of records into Postgres. In SQL Server we used table types (table-valued parameters) for this. In Postgres we have to do something similar, and I have done something like the below.

I have a table MyuserDefinedTable with columns like:

  • id uuid
  • date timestamp
  • value integer
  • name text

I have created a function in Postgres like:

CREATE OR REPLACE FUNCTION myuserDefinedTableupdatefunction(
    _table_name regclass,
    _column_names text,
    _insert_arr MyuserDefinedTable[]) RETURNS text AS
$func$
BEGIN
    EXECUTE format('INSERT INTO %s(%s) SELECT * FROM unnest($1)', _table_name, _column_names)
        USING _insert_arr;
    RETURN 'Saved';
EXCEPTION WHEN others THEN
    RETURN 'Failure';
END
$func$ LANGUAGE plpgsql;

It works fine if I call the function by passing the parameters directly, like below:

Select myuserDefinedTableupdatefunction('MyuserDefinedTable', 'ID,date,value,NAME', ARRAY[ROW('a1563404-0d1d-11ea-b563-3ca82a1c6940', '2019-11-21 00:00:00.0', '3', 'hello'), ROW('b5563404-7d1d-11ea-b563-3ca82a1c6940', '2019-11-22 00:00:00.0', '4', 'hii')]::MyuserDefinedTable[])

But it does not execute (it throws an error like "missing left parenthesis", even though I ensured everything is fine syntactically) when I try to insert values dynamically through JDBC like below:

String query = "Select myuserDefinedTableupdatefunction('MyuserDefinedTable', 'ID,date,value,NAME', ARRAY[?]::MyuserDefinedTable[])";

Object[] arr = objectList.toArray(new Object[0]);
// objectList holds the row values appended per record, e.g.
// [0]: a1563404-0d1d-11ea-b563-3ca82a1c6940, 2019-11-21 00:00:00.0, 100.0, 3, hello
// [1]: b1563404-0d1d-11ea-b563-3ca82a1c6940, 2019-11-22 00:00:00.0, 100.0, 4, hi
Array array = getSession().getConnection().createArrayOf("MyuserDefinedTable", arr);
ps.setArray(1, array);
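The "missing left parenthesis" error is consistent with the array elements not being shaped like row values: when `createArrayOf` is given a composite (row) type name, the PostgreSQL JDBC driver sends each element as text, so each element must already read as a composite literal such as `("…","…","…","…")`. A minimal sketch under that assumption (the helper name `toCompositeLiteral` is mine, not part of the driver):

```java
import java.util.StringJoiner;

public class CompositeLiteral {

    // Hypothetical helper: render one row as a PostgreSQL composite literal,
    // e.g. ("a1563404-0d1d-11ea-b563-3ca82a1c6940","2019-11-21 00:00:00.0","3","hello").
    // Each field is double-quoted; backslashes and double quotes inside a field
    // are backslash-escaped; a null field is left empty, which Postgres reads as NULL.
    static String toCompositeLiteral(String... fields) {
        StringJoiner joiner = new StringJoiner(",", "(", ")");
        for (String field : fields) {
            if (field == null) {
                joiner.add("");
            } else {
                joiner.add("\"" + field.replace("\\", "\\\\").replace("\"", "\\\"") + "\"");
            }
        }
        return joiner.toString();
    }

    public static void main(String[] args) {
        String row = toCompositeLiteral(
                "a1563404-0d1d-11ea-b563-3ca82a1c6940",
                "2019-11-21 00:00:00.0", "3", "hello");
        System.out.println(row);
        // These literal strings could then be bound as the array parameter:
        //   Array array = conn.createArrayOf("MyuserDefinedTable", new String[]{ row, ... });
        //   ps.setArray(1, array);
    }
}
```

An alternative worth considering is skipping the wrapper function entirely and batching plain INSERT statements with `PreparedStatement.addBatch()`; the pgJDBC driver can combine those into multi-row inserts when the connection property `reWriteBatchedInserts=true` is set.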

Can anyone help me with the above?

Also, inserting huge numbers of records this way takes almost twice as long as normal insert statements.

Is there any effective way of inserting huge numbers of records into a Postgres DB using an array with a complex data type?

Edit: Following some suggestions, I also tried using CopyManager, like:

String sql = "COPY MyuserDefinedTable FROM stdin CSV HEADER DELIMITER ','";
BaseConnection pgcon = (BaseConnection) getSession().getConnection();
CopyManager mgr = new CopyManager(pgcon);
try (Reader in = new BufferedReader(new FileReader(new File("C:/Testing/data.csv")))) { // the file is on my local system
    long rowsAffected = mgr.copyIn(sql, in);
    System.out.print(rowsAffected);
} catch (SQLException | IOException e) {
    e.printStackTrace();
}

But the above code returns 0 rows affected.

Any suggestions, please.

Thanks all.

The below solution worked for me. I tried using COPY by passing byte arrays instead of getting the data from files (as we are using the cloud, I came to know that file permissions would be a problem in AWS).

https://forums.aws.amazon.com/thread.jspa?threadID=141798

        List<String> valueList = myData();

        // Converting the list of rows to a byte array
        byte[] bytes = null;
        try (ByteArrayOutputStream baos = new ByteArrayOutputStream()) {
            DataOutputStream out = new DataOutputStream(baos);
            for (String element : valueList) {
                out.writeBytes(element);
            }
            bytes = baos.toByteArray();
        }
        if (bytes != null) {
            try (ByteArrayInputStream input = new ByteArrayInputStream(bytes)) {

                // Copying records into the table using COPY .. FROM and CopyManager in Postgres
                String sql = "COPY " + tableName + " FROM stdin DELIMITER ',' NULL AS 'null'";
                BaseConnection pgcon = (BaseConnection) getSession().getConnection();
                CopyManager mgr = new CopyManager(pgcon);
                long rowsAffected = mgr.copyIn(sql, input);
            }
        }
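For this to work, the strings returned by `myData()` have to match the COPY options used above: one line per row, fields separated by `,`, and the literal string `null` for NULL values. A minimal sketch of a hypothetical row formatter under those assumptions (the name `toCopyRow` is mine; fields are assumed not to contain the delimiter or newlines, since text-format COPY has no quoting):

```java
public class CopyRows {

    // Hypothetical helper: format one record as a COPY text-format line matching
    // "COPY ... FROM stdin DELIMITER ',' NULL AS 'null'". A null value becomes
    // the literal string "null"; every row ends with a newline.
    static String toCopyRow(Object... values) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < values.length; i++) {
            if (i > 0) sb.append(',');
            sb.append(values[i] == null ? "null" : values[i].toString());
        }
        return sb.append('\n').toString();
    }

    public static void main(String[] args) {
        String row = toCopyRow("a1563404-0d1d-11ea-b563-3ca82a1c6940",
                "2019-11-21 00:00:00.0", 3, "hello");
        System.out.print(row);
        // prints: a1563404-0d1d-11ea-b563-3ca82a1c6940,2019-11-21 00:00:00.0,3,hello
    }
}
```

If fields could ever contain commas or newlines, switching the COPY command to CSV format (which supports quoting) would be the safer choice.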

This seems faster than the approach I posted in the question above.
