
Best optimizations for a large SQL Server table (100-200 million records)

What are the best options, recommendations, and optimizations when working with a large SQL Server 2005 table containing 100-200 million records?

Since you didn't state the purpose of the database, or the requirements, here are some general things, in no particular order:

  1. Small clustered index on each table. Consider making this your primary key on each table. This will be very efficient and save on space in the main table and dependent tables.
  2. Appropriate non-clustered indexes (covering indexes where possible)
  3. Referential Integrity
  4. Normalized Tables
  5. Consistent naming on all database objects for easier maintenance
  6. Appropriate Partitioning (table and index) if you have the Enterprise Edition of SQL Server
  7. Appropriate check constraints on tables if you are going to allow direct data manipulation in the database.
  8. Decide where your business rules are going to reside and don't deviate from that. In most cases they do not belong in the database.
  9. Examine the execution plans of your heavily used queries (at least) and look for table scans, which will kill performance.
  10. Be prepared to deal with deadlocks. With a database of this size, especially if there will be heavy writing, deadlocks could very well be a problem.
  11. Take advantage of views to hide join complexity, enable query optimization, and implement flexible security.
  12. Consider using schemas to better organize data and implement flexible security.
  13. Get familiar with Profiler. With a database of this size, you will more than likely be spending some time trying to determine query bottlenecks. Profiler can help you here.
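A few of the points above (a small clustered primary key, covering non-clustered indexes, and check constraints) can be sketched in T-SQL. The table and column names here (dbo.Orders, CustomerID, etc.) are hypothetical, chosen only to illustrate the pattern:

```sql
-- Points 1 and 7: a small clustered primary key (an IDENTITY INT keeps the
-- key narrow, which shrinks the clustered index and every non-clustered
-- index that references it), plus a check constraint to guard direct
-- data manipulation.
CREATE TABLE dbo.Orders (
    OrderID     INT IDENTITY(1,1) NOT NULL,
    CustomerID  INT               NOT NULL,
    OrderDate   DATETIME          NOT NULL,
    Amount      DECIMAL(18,2)     NOT NULL,
    CONSTRAINT PK_Orders PRIMARY KEY CLUSTERED (OrderID),
    CONSTRAINT CK_Orders_Amount CHECK (Amount >= 0)
);

-- Point 2: a covering non-clustered index for a frequent query such as
--   SELECT OrderDate, Amount FROM dbo.Orders WHERE CustomerID = @id;
-- the INCLUDE columns let the query be answered from the index alone,
-- avoiding lookups into the base table.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
    ON dbo.Orders (CustomerID)
    INCLUDE (OrderDate, Amount);
```

At 100-200 million rows, the difference between a covering index seek and a clustered index scan is the difference between reading a handful of pages and reading the whole table.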

As a rule of thumb, once a table holds more than about 25 million records you should consider table (and index) partitioning, but this feature is only available in the Enterprise Edition of SQL Server (and, correspondingly, the Developer Edition).
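For reference, partitioning in SQL Server 2005 is set up with a partition function and a partition scheme. The boundary dates, object names, and filegroup placement below are illustrative only; a real deployment would typically map partitions to separate filegroups:

```sql
-- A range-right partition function splitting rows by date; rows with
-- OrderDate < 2004-01-01 go to partition 1, 2004 rows to partition 2, etc.
CREATE PARTITION FUNCTION pf_OrderDate (DATETIME)
    AS RANGE RIGHT FOR VALUES ('2004-01-01', '2005-01-01');

-- Map every partition to the PRIMARY filegroup (simplest case).
CREATE PARTITION SCHEME ps_OrderDate
    AS PARTITION pf_OrderDate ALL TO ([PRIMARY]);

-- A table created on the scheme is physically partitioned by OrderDate.
CREATE TABLE dbo.OrdersPartitioned (
    OrderID   INT      NOT NULL,
    OrderDate DATETIME NOT NULL
) ON ps_OrderDate (OrderDate);
```

Queries that filter on the partitioning column can then be satisfied by scanning only the relevant partitions (partition elimination), and old partitions can be switched out for fast archival.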
