
Caching in PHP for speeding up

I am running an application (built on PHP & MySQL) on a VPS. I have an article table with millions of records in it. Whenever a user logs in, I display the last 50 records for each section.

So every time a user logs in or refreshes the page, it executes an SQL query to get those records. There are now a lot of users on the website, and because of that my page speed has dropped significantly.

I did some research on caching and found that I can read the MySQL data by section and number of articles, e.g. (section - 1 and no. of articles - 50), and store it in a disk file at cache/md5(section no.).

Then, in future, when I get a request for that section, I can just get the data from cache/md5(section no).
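A minimal sketch of that file-cache idea; the table and column names, the cache directory, and the 180-second freshness window are illustrative assumptions, not part of the question:

```php
<?php
// Minimal file-based cache for a section's latest articles.
// Table/column names, cache directory, and TTL are assumptions.
function getSectionArticles(PDO $pdo, int $section, int $limit = 50, int $ttl = 180): array
{
    $cacheDir = sys_get_temp_dir() . '/article_cache';
    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0775, true);
    }
    $cacheFile = $cacheDir . '/' . md5("section_$section");

    // Serve from cache when the file exists and is still fresh.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return unserialize(file_get_contents($cacheFile));
    }

    // Cache miss: query the database and rewrite the cache file.
    $stmt = $pdo->prepare(
        'SELECT id, title, body FROM articles
         WHERE section = ? ORDER BY id DESC LIMIT ?'
    );
    $stmt->bindValue(1, $section, PDO::PARAM_INT);
    $stmt->bindValue(2, $limit, PDO::PARAM_INT);
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

    // Write to a temp file and rename, so a concurrent reader
    // never sees a half-written cache file.
    $tmp = $cacheFile . '.' . uniqid('', true);
    file_put_contents($tmp, serialize($rows));
    rename($tmp, $cacheFile);

    return $rows;
}
```

The atomic rename matters once several requests can refresh the same section at once.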

The above solution looks great. But before I go ahead, I would really like to clarify a few doubts below with the experts.

  • Will it really speed up my application? (I know disk I/O is faster than a MySQL query, but I don't know by how much.)
  • I am currently using pagination on my page: display the first 5 articles, and when the user clicks "display more", display the next 5 articles, and so on. This is easy to do in a MySQL query, but I have no idea how I should do it if I store all 50 records in a cache file. If someone could share some info, that would be great.
  • Any alternative solution, if you believe the above will not work.
  • Any open-source application (PHP), if you know of one.
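On the pagination doubt: once all 50 rows are in one cached array, each "display more" click is just an array slice rather than a new LIMIT/OFFSET query. A small sketch (the row shape here is made up for illustration):

```php
<?php
// Paginate records that were already loaded from the cache,
// instead of issuing a new LIMIT/OFFSET query per click.
function pageOf(array $cachedRows, int $page, int $perPage = 5): array
{
    // Page 1 -> rows 0..4, page 2 -> rows 5..9, and so on.
    return array_slice($cachedRows, ($page - 1) * $perPage, $perPage);
}

// Example with 50 stand-in articles: fetch page 3 (articles 11-15).
$rows  = array_map(fn ($i) => ['id' => $i, 'title' => "Article $i"], range(1, 50));
$page3 = pageOf($rows, 3);
```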

Thank you in advance.

Regards, Raj

I ran into the same issue, where every page load results in 2+ queries being run. Thankfully they're very similar queries being run over and over, so caching (as in your situation) is very helpful.

You have a couple of options:

  1. Offload the database to a separate VPS on the same network, so it can be scaled up and down as needed.

  2. Cache the data from each query, and try to retrieve it from the cache before hitting the database.

In the end we chose both, installing Memcached and its PHP extension for query-caching purposes. Memcached is a key-value store (much like a PHP associative array) with a set expiration time, measured in seconds, for each value stored. Since it stores everything in RAM, the tradeoff for volatile cache data is extremely fast read/write times, much better than the filesystem.

Our implementation was basically to run every query through a filter: if it's a SELECT statement, cache it by setting the Memcached key to "namespace_[md5 of query]" and the value to an array with all resulting rows. Caching for 120 seconds (2 minutes) should be more than enough to help with the server load.
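A hedged sketch of such a filter using the pecl `Memcached` extension; the `app_` namespace prefix and the TTL are example choices, and note the extension serializes values itself, so the row array can be stored directly:

```php
<?php
// Run queries through a caching filter: SELECTs are answered from
// Memcached when possible, keyed by a namespace prefix plus an md5
// of the query. The "app_" prefix and 120-second TTL are examples.
function cachedQuery(PDO $pdo, Memcached $mc, string $sql, array $params = []): array
{
    $isSelect = stripos(ltrim($sql), 'SELECT') === 0;
    $key = 'app_' . md5($sql . serialize($params));

    if ($isSelect) {
        $hit = $mc->get($key);
        if ($hit !== false) {
            return $hit;            // cache hit: the database is never touched
        }
    }

    $stmt = $pdo->prepare($sql);
    $stmt->execute($params);
    $rows = $isSelect ? $stmt->fetchAll(PDO::FETCH_ASSOC) : [];

    if ($isSelect) {
        $mc->set($key, $rows, 120); // expire after 120 seconds (2 minutes)
    }
    return $rows;
}
```

Hashing the bound parameters along with the SQL text keeps two different parameterized runs of the same statement from colliding on one key.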

If Memcached isn't a viable solution, store all 50 articles for each section as an RSS feed. You can pull all the articles at once, grabbing the content of each article with SimpleXML and wrapping it in your site's article-template HTML, as per the site design. Once the data is there, use CSS styling to display only X articles, using JavaScript for pagination.

Since two processes modifying the same file at the same time would be a bad idea, have adding a new story to a section trigger an event, which adds the story to a message queue. That message queue would be processed by a worker that does two consecutive things, also using SimpleXML:

  1. Remove the oldest story at the end of the XML file

  2. Add the new story taken from the message queue to the top of the XML file
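The two worker steps could look roughly like this; the feed path and `<item>` fields are assumptions, and since SimpleXML can only append children, the prepend step drops down to DOM for a moment (a real worker would also lock the file or write atomically):

```php
<?php
// Rotate a section's RSS feed: drop the oldest <item>, prepend the
// newest. Feed path and item fields are illustrative assumptions.
function rotateFeed(string $feedPath, string $title, string $link): void
{
    $rss = simplexml_load_file($feedPath);
    $channel = $rss->channel;

    // Step 1: remove the oldest story (the last <item> in the channel).
    $n = count($channel->item);
    if ($n > 0) {
        unset($channel->item[$n - 1]);
    }

    // Step 2: build the new story and insert it before the first
    // remaining <item>, via DOM, so it ends up at the top of the feed.
    $new = new SimpleXMLElement('<item/>');
    $new->addChild('title', htmlspecialchars($title));
    $new->addChild('link', htmlspecialchars($link));

    $channelDom = dom_import_simplexml($channel);
    $newDom = $channelDom->ownerDocument->importNode(dom_import_simplexml($new), true);
    if (isset($channel->item[0])) {
        $channelDom->insertBefore($newDom, dom_import_simplexml($channel->item[0]));
    } else {
        $channelDom->appendChild($newDom);
    }

    $rss->asXML($feedPath);
}
```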

If you'd like, the per-section RSS feeds can also be a publicly facing feature.
