
SELECT query in chunks using PostgreSQL

I have a table with well over a million entries. I'm pulling all of this data into Perl with a single query ($query = "SELECT * FROM table1"), and the problem is that this uses a lot of resources (mostly memory). Given my novice state with Perl and Postgres, how would I break that query into chunks or parts?

For example,

$query = "SELECT * FROM table1 LIMIT 100000";

That would fetch only 100K results, but table1 is 10M records in size. How would I transform it so that only 100K results are held in memory at a time until the whole table has been processed?

First of all, do you really need all 10M records? Probably not, so retrieve only the records your work actually requires. Second, do you really need the data from every column? If not, select only the columns you need, e.g. SELECT col1, col2, col3 FROM table instead of SELECT *. There is no point in fetching all the data and filling your server's memory.

In the worst case, if you genuinely need everything you described, then you have no choice but to retrieve all the records. In that case you can implement paging to fetch the data in batches rather than all at once, as in the sketch below.
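One way to do that paging in Perl is LIMIT/OFFSET with DBI and DBD::Pg. This is only a minimal sketch: the connection string, credentials, and the id column used for ordering are assumptions, not details from the question.

    use strict;
    use warnings;
    use DBI;

    # Hypothetical connection details -- adjust dbname/host/user/password.
    my $dbh = DBI->connect(
        "dbi:Pg:dbname=mydb;host=localhost",
        "user", "password",
        { RaiseError => 1, AutoCommit => 1 },
    );

    # Order by a stable key so each page is deterministic; "id" is assumed here.
    my $sth = $dbh->prepare(
        "SELECT * FROM table1 ORDER BY id LIMIT ? OFFSET ?"
    );

    my $chunk_size = 100_000;
    my $offset     = 0;

    while (1) {
        $sth->execute($chunk_size, $offset);
        my $rows = $sth->fetchall_arrayref;   # only this chunk is held in memory
        last unless @$rows;                   # no more rows -> done

        for my $row (@$rows) {
            # process one row; $row->[0], $row->[1], ... are the columns
        }

        $offset += $chunk_size;
    }

    $dbh->disconnect;

Each execute pulls only one 100K-row page across the wire. Keep in mind that the server still has to skip over OFFSET rows on each pass, so later pages get progressively slower; ordering by an indexed key helps.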

Pagination?

This might be of use: https://www.postgresql.org/docs/8.3/static/queries-limit.html

Sorry if I have misunderstood.
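That page covers LIMIT and OFFSET. Applied to the question's table1 (assuming an id column so the rows have a stable order), the chunks would look like:

    SELECT * FROM table1 ORDER BY id LIMIT 100000 OFFSET 0;       -- rows 1-100000
    SELECT * FROM table1 ORDER BY id LIMIT 100000 OFFSET 100000;  -- rows 100001-200000, and so on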

Trick:

You can use the modulus operator to chunk the data.

For example:

Say you want to split 1,000 records into 4 chunks using a sequential id: just take that id modulo 4,

SELECT *
  FROM input_data_control b
 WHERE b.bill_schedule_month = '201910'
       AND mod(input_data_control_id, 4) = [0...3]
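A rough sketch of how the question's Perl code could walk those four buckets with DBI, reusing a $dbh handle as in the earlier example; the table and column names come from this answer's SQL, everything else is assumed:

    my $sth = $dbh->prepare(
        "SELECT *
           FROM input_data_control
          WHERE bill_schedule_month = '201910'
            AND mod(input_data_control_id, 4) = ?"
    );

    for my $bucket (0 .. 3) {    # one pass per modulus bucket
        $sth->execute($bucket);
        while (my $row = $sth->fetchrow_arrayref) {
            # process one row of this bucket
        }
    }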
