What is the fastest way to import data from an application (Java) into a temporary table?
I am developing an application with Java, Spring, and PostgreSQL. I am using Spring-JDBC as the database layer.
In my Java application, I have around 10,000 rows that need to be inserted into a temporary table. What is the fastest way to import them?
I already tried:
create temporary table my_table on commit drop;
with data as (values (...), (...), ...) -- all 10,000 rows enumerated
insert into my_table select * from data;
But this fails because the query becomes too big to parse. Should I simply send smaller batches, or is there a cleverer approach for streaming data to the database?
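Smaller batches are easy to do with standard JDBC batching (Spring's JdbcTemplate.batchUpdate wraps the same mechanism). A minimal sketch, assuming a hypothetical table my_table(id, name) and a batch size of 500:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

class BatchInsert {
    // Rows per executeBatch() round trip; 500-1000 is a common choice.
    static final int BATCH_SIZE = 500;

    // Inserts all rows through one PreparedStatement, flushing in batches
    // so no single statement grows too large to parse.
    static void insertAll(Connection conn, List<String[]> rows) throws SQLException {
        String sql = "insert into my_table (id, name) values (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.addBatch();
                if (++pending == BATCH_SIZE) {
                    ps.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) ps.executeBatch(); // flush the remainder
        }
    }

    // Pure helper: number of executeBatch() round trips for n rows.
    static int batchCount(int n, int batchSize) {
        return (n + batchSize - 1) / batchSize;
    }
}
```

With 10,000 rows and a batch size of 500 this is only 20 round trips, and with `reWriteBatchedInserts=true` in the connection URL the PostgreSQL driver collapses each batch into a multi-row insert on the wire.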
Update

What I am doing:
In the database, there is a table with, let's say, 'external entities'. Every few hours, I get an update (via an ActiveMQ broker) with the complete current set of entities. This set needs to be in my database. I do the following:
I cannot simply truncate the actual table and insert directly into it, because of foreign key constraints on that table and triggers on that table. In other words, I am only reflecting the actual changes.
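The reflect-the-changes step can be sketched as: bulk-load the snapshot into a temp table, then apply two set-based statements against the real table. A sketch under assumed names (external_entity, staging, and an id key are all hypothetical); it covers additions and removals, while updating changed rows would need an extra UPDATE:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

class SnapshotSync {
    // Hypothetical schema: external_entity(id, ...) is the real table,
    // staging is a temp table with the same shape holding the new snapshot.
    static final String CREATE_STAGING =
        "create temporary table staging (like external_entity) on commit drop";
    static final String DELETE_GONE =
        "delete from external_entity e where not exists"
      + " (select 1 from staging s where s.id = e.id)";
    static final String INSERT_NEW =
        "insert into external_entity select s.* from staging s where not exists"
      + " (select 1 from external_entity e where e.id = s.id)";

    // Runs the sync in one transaction: create the temp table, bulk-load it
    // (elided), then delete vanished rows and insert new ones.
    static void sync(Connection conn) throws SQLException {
        boolean oldAutoCommit = conn.getAutoCommit();
        conn.setAutoCommit(false);
        try (Statement st = conn.createStatement()) {
            st.execute(CREATE_STAGING);
            // ... bulk-load the snapshot into "staging" here ...
            st.executeUpdate(DELETE_GONE);
            st.executeUpdate(INSERT_NEW);
            conn.commit();
        } catch (SQLException e) {
            conn.rollback();
            throw e;
        } finally {
            conn.setAutoCommit(oldAutoCommit);
        }
    }
}
```

Because the deletes and inserts are ordinary DML, the foreign keys and triggers on the real table fire only for rows that actually changed.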
Try using copy from stdin. It is the smartest and fastest way to load 10K rows.

http://pgpen.blogspot.ie/2013/05/using-copy-in-your-jdbc-code.html
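COPY streams the rows instead of parsing one giant statement. A sketch of encoding rows into COPY's TEXT format (tab-separated, backslash-escaped, \N for SQL NULL); the encoder is plain Java, while the CopyManager call from the linked post is shown as a comment, since the PostgreSQL driver is a third-party dependency:

```java
import java.util.List;

class CopyLoader {
    // Encodes rows into PostgreSQL COPY TEXT format: fields separated by
    // tabs, rows by newlines, special characters backslash-escaped,
    // and SQL NULL written as \N.
    static String toCopyText(List<String[]> rows) {
        StringBuilder sb = new StringBuilder();
        for (String[] row : rows) {
            for (int i = 0; i < row.length; i++) {
                if (i > 0) sb.append('\t');
                String v = row[i];
                if (v == null) { sb.append("\\N"); continue; }
                for (char c : v.toCharArray()) {
                    switch (c) {
                        case '\\': sb.append("\\\\"); break;
                        case '\t': sb.append("\\t");  break;
                        case '\n': sb.append("\\n");  break;
                        case '\r': sb.append("\\r");  break;
                        default:   sb.append(c);
                    }
                }
            }
            sb.append('\n');
        }
        return sb.toString();
    }

    // With the PostgreSQL JDBC driver on the classpath, the payload is sent
    // through the driver's CopyManager in a single streamed command:
    //
    //   org.postgresql.copy.CopyManager cm =
    //       conn.unwrap(org.postgresql.PGConnection.class).getCopyAPI();
    //   cm.copyIn("copy my_table from stdin", new java.io.StringReader(payload));
}
```

CopyManager.copyIn accepts any Reader or InputStream, so the 10K rows never need to exist as one SQL string at all.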