I am developing an application with Java, Spring, and PostgreSQL, using Spring JDBC as the database layer.
In my Java application, I have around 10,000 rows that need to be inserted into a temporary table. What is the fastest way to import them?
I already tried:
create temporary table my_table (/* column definitions */) on commit drop;
with data as (values (...), (...), ...) -- all 10,000 rows enumerated
insert into my_table select * from data;
But this fails because the query becomes too big to parse.
Should I simply send smaller batches, or is there a more clever approach for streaming data to the database?
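For the batching route, here is a minimal plain-JDBC sketch (Spring's `JdbcTemplate.batchUpdate` wraps the same `addBatch`/`executeBatch` mechanism; the table name, column names, and batch size of 500 are assumptions):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class TempTableBatchInsert {

    // Number of JDBC batches needed for the given row count and batch size.
    static int batchCount(int rows, int batchSize) {
        return (rows + batchSize - 1) / batchSize;
    }

    // Sketch: send all rows through one prepared statement in batches of 500.
    // "my_table", "col1", and "col2" are made-up names; adapt to the real schema.
    static void insertAll(Connection conn, List<Object[]> rows) throws SQLException {
        String sql = "insert into my_table (col1, col2) values (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            for (Object[] row : rows) {
                ps.setObject(1, row[0]);
                ps.setObject(2, row[1]);
                ps.addBatch();
                if (++pending == 500) {   // flush a full batch
                    ps.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) {
                ps.executeBatch();        // flush the remainder
            }
        }
    }
}
```

With the PostgreSQL driver, adding `reWriteBatchedInserts=true` to the JDBC URL lets pgjdbc rewrite each batch into multi-row inserts on the wire, which usually speeds this up considerably.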
Update

What I am doing:

In the database, there is a table with, let's say, 'external entities'. Every few hours, I receive an update (via an ActiveMQ broker) containing the complete current set of entities, and this set needs to end up in my database. I load the incoming set into a temporary table and compute the difference against the actual table. I cannot simply truncate the actual table and insert the new set directly, because of foreign key constraints on that table and triggers attached to it. In other words, I only reflect the actual changes.
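The change-reflection step could be sketched in SQL like this (a hypothetical two-column schema with key `id` and payload `data`, and PostgreSQL 9.5+ for `on conflict`; adapt to the real table):

```sql
-- delete rows that disappeared from the incoming set
delete from actual_table a
where not exists (select 1 from my_table t where t.id = a.id);

-- insert new rows and update changed ones
insert into actual_table (id, data)
select id, data from my_table
on conflict (id) do update set data = excluded.data
where actual_table.data is distinct from excluded.data;
```

The `where ... is distinct from` clause keeps the triggers from firing on rows that did not actually change.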
Try using COPY ... FROM STDIN. It is the smartest and fastest way to load 10K rows; the pgjdbc driver exposes it through the CopyManager API:
http://pgpen.blogspot.ie/2013/05/using-copy-in-your-jdbc-code.html
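A sketch of the COPY approach with pgjdbc's `CopyManager` (the table name and two-column layout are assumptions, and `toCopyLine` is a hypothetical helper for COPY's tab-separated text format):

```java
import java.io.IOException;
import java.io.StringReader;
import java.sql.Connection;
import java.sql.SQLException;
import java.util.List;
import org.postgresql.PGConnection;
import org.postgresql.copy.CopyManager;

public class CopyInsertExample {

    // Format one row in COPY text format: tab-separated, newline-terminated.
    // Backslashes, tabs, and newlines in the data must be escaped; \N marks NULL.
    static String toCopyLine(Object... cols) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < cols.length; i++) {
            if (i > 0) sb.append('\t');
            if (cols[i] == null) {
                sb.append("\\N");
            } else {
                sb.append(cols[i].toString()
                        .replace("\\", "\\\\")
                        .replace("\t", "\\t")
                        .replace("\n", "\\n"));
            }
        }
        return sb.append('\n').toString();
    }

    // Stream all rows into the temp table in a single COPY operation.
    static long copyAll(Connection conn, List<Object[]> rows)
            throws SQLException, IOException {
        CopyManager cm = conn.unwrap(PGConnection.class).getCopyAPI();
        StringBuilder buf = new StringBuilder();
        for (Object[] row : rows) {
            buf.append(toCopyLine(row));
        }
        // copyIn returns the number of rows loaded
        return cm.copyIn("copy my_table from stdin", new StringReader(buf.toString()));
    }
}
```

For 10K rows, buffering the whole payload in memory as above is fine; for much larger sets you would feed `copyIn` a streaming `Reader` instead.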