Increase postgresql-9.5 performance on 200+ million records

I have 200+ million records in a postgresql-9.5 table. Almost all queries are analytical. To optimize query performance I have tried indexing so far, but it does not seem sufficient. What other options should I look into?

Depending on your WHERE clause conditions, create a partitioned table ( https://www.postgresql.org/docs/10/static/ddl-partitioning.html ); it can reduce query cost drastically. Also, if a certain fixed value appears in the WHERE clause, create a partial index on the partitioned table. One important point: check the order of columns in the WHERE clause and match that order when indexing.
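A minimal sketch of what that could look like, assuming you upgrade to v10 (9.5 has no declarative partitioning) and that your queries filter on a date column. The table and column names (`measurements`, `created_at`, `region`) are hypothetical:

```sql
-- Range-partition a large table by date (PostgreSQL 10+ declarative partitioning)
CREATE TABLE measurements (
    id         bigint,
    created_at date NOT NULL,
    region     text,
    value      numeric
) PARTITION BY RANGE (created_at);

CREATE TABLE measurements_2018 PARTITION OF measurements
    FOR VALUES FROM ('2018-01-01') TO ('2019-01-01');

-- Partial index for a fixed value that recurs in WHERE clauses,
-- e.g. queries that always filter on region = 'EU':
CREATE INDEX ON measurements_2018 (created_at)
    WHERE region = 'EU';
```

Queries whose WHERE clause constrains `created_at` will then only scan the matching partitions (partition pruning), and the partial index stays much smaller than a full-table index.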

You should upgrade to PostgreSQL v10 so that you can use parallel query.

That enables you to run sequential and index scans with several background workers in parallel, which can speed up these operations on large tables.
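To illustrate, a session-level sketch (the `measurements` table name is hypothetical; the settings are real PostgreSQL parameters):

```sql
-- Allow up to 4 background workers per parallel operation in this session:
SET max_parallel_workers_per_gather = 4;

-- The plan should now show a Gather node over a Parallel Seq Scan:
EXPLAIN SELECT count(*) FROM measurements;
```

The planner only chooses a parallel plan when it estimates the table is large enough to be worth it, so on a 200-million-row table an aggregate like this is a typical beneficiary.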

A good database layout, good indexing, lots of RAM and fast storage are also important factors for good performance of analytical queries.
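On the RAM and storage side, the usual starting points are a handful of `postgresql.conf` settings. The values below are illustrative only and depend entirely on your hardware:

```
# postgresql.conf sketch (illustrative values, tune for your machine)
shared_buffers = 8GB          # ~25% of RAM is a common starting point
work_mem = 256MB              # per sort/hash operation; analytical queries benefit
effective_cache_size = 24GB   # hint to the planner about OS file cache size
random_page_cost = 1.1        # lower this on SSD storage
```

Note that `work_mem` is allocated per operation, not per connection, so large values are only safe with few concurrent analytical queries.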

If the analysis involves a lot of aggregation, consider materialized views to store the aggregates. Materialized views do take up space and they need to be refreshed too. But they are very useful for data aggregation.
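A sketch of that idea, again using the hypothetical `measurements` table:

```sql
-- Precompute a daily aggregate once instead of on every query:
CREATE MATERIALIZED VIEW daily_totals AS
SELECT created_at, region, sum(value) AS total
FROM measurements
GROUP BY created_at, region;

-- A unique index is required for REFRESH ... CONCURRENTLY:
CREATE UNIQUE INDEX ON daily_totals (created_at, region);

-- Refresh periodically (e.g. from cron) without blocking readers:
REFRESH MATERIALIZED VIEW CONCURRENTLY daily_totals;
```

Analytical queries then read the small `daily_totals` table instead of aggregating 200+ million rows each time, at the cost of the refresh lag.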
