MySQL database with entries increasing by 1 million every month: how can I partition the database to keep query time in check?

  • I am a college undergrad working on a PHP and MySQL based inventory management system that operates country-wide. Its database is projected to grow by more than 1 million entries every month, and its current size is about 2 million.
  • I need to prevent the exponential increase in query time, which currently ranges from 7 to 11 seconds for most modules.

  • The thing is that the probability of accessing data entered in the last month is much higher than for any older data. So I believe that partitioning the data based on its time of entry should keep the query time in check. How can I achieve this?

  • Specifically, I want a way to cache the last month's data so that every query first searches for the product in the tables holding recent data, and only searches the rest of the data if the product is not found in the last month's data.

If you want to use the partitioning functions of MySQL, have a look at this article.
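As an illustration, here is a minimal sketch of monthly RANGE partitioning; the table and column names (inventory_item, entered_at) are hypothetical, not taken from the question:

```sql
-- Hypothetical inventory table, partitioned by the month a row was entered.
CREATE TABLE inventory_item (
    id         INT  NOT NULL AUTO_INCREMENT,
    product_id INT  NOT NULL,
    quantity   INT  NOT NULL,
    entered_at DATE NOT NULL,
    -- the partitioning column must appear in every unique key, including the primary key
    PRIMARY KEY (id, entered_at)
)
PARTITION BY RANGE (TO_DAYS(entered_at)) (
    PARTITION p2020_01 VALUES LESS THAN (TO_DAYS('2020-02-01')),
    PARTITION p2020_02 VALUES LESS THAN (TO_DAYS('2020-03-01')),
    PARTITION p_future VALUES LESS THAN MAXVALUE
);
```

Queries that filter on entered_at then only scan the relevant monthly partitions (partition pruning), and new months can be split out of p_future later with ALTER TABLE ... REORGANIZE PARTITION.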

That being said, there are a few restrictions when using partitions:

  • you can't have unique keys (the primary key included) that don't contain all the columns of the partition key (see the sketch after this list)
  • you lose some database portability, as partitioning works quite differently in other databases.
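To illustrate the first restriction on the hypothetical table sketched above, MySQL rejects a unique index that does not include the partitioning column:

```sql
-- Rejected: "A UNIQUE INDEX must include all columns in the table's partitioning function"
ALTER TABLE inventory_item ADD UNIQUE KEY uq_product (product_id);

-- Accepted, because the partitioning column entered_at is part of the key
ALTER TABLE inventory_item ADD UNIQUE KEY uq_product (product_id, entered_at);
```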

You can also handle partitioning manually by moving old records to an archive table at regular intervals. Of course, you will then also have to implement different code to read those archived records.
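A minimal sketch of that manual approach, again with hypothetical table names (inventory_item holding the recent data, inventory_item_archive with the same structure for older rows):

```sql
-- Run periodically (e.g. from a nightly cron job):
-- copy rows older than one month into the archive table, then remove them from the hot table.
INSERT INTO inventory_item_archive
SELECT * FROM inventory_item
WHERE entered_at < DATE_SUB(CURDATE(), INTERVAL 1 MONTH);

DELETE FROM inventory_item
WHERE entered_at < DATE_SUB(CURDATE(), INTERVAL 1 MONTH);

-- Read path: the application queries the small recent table first ...
SELECT * FROM inventory_item WHERE product_id = 12345;

-- ... and only falls back to the archive if nothing was found.
SELECT * FROM inventory_item_archive WHERE product_id = 12345;
```

This matches the access pattern described in the question: most lookups are served by the small recent table, and the archive is only touched on the rare miss.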

Also note that your query time seems quite long. I have worked with tables much larger than 2 million records with much better access times.
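At 2 million rows, a missing index is a more likely culprit than table size; a hedged sketch of how to check, still using the hypothetical names from above:

```sql
-- Inspect the query plan; type = ALL in the output means a full table scan.
EXPLAIN SELECT * FROM inventory_item WHERE product_id = 12345;

-- Add an index on the column the slow queries filter on.
CREATE INDEX idx_product_id ON inventory_item (product_id);
```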
