How to speed up / break up a process into multiple parts (RSS, cURL, PHP)
I'm experimenting with an RSS reader/fetcher I'm writing at the moment. Everything is going smoothly except for one thing: it's terribly slow.
Let me explain:
For now I am looping through 11 feeds, which gives me a page loading time of 18 seconds. This is without updating the database. When some new articles are found, it goes up to 22 seconds (on localhost).
On a live web server, my guess is that this will be even slower, and may go beyond the execution time limit PHP is set up with.
So my question is: what are your suggestions to improve speed? And if that is not possible, what's the best way to break this down into multiple executions, say two feeds at a time? I'd like to keep it all automated; I don't want to click after every two feeds.
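One likely cause of the 18 seconds is fetching the 11 feeds one after another, so every feed's network latency adds up. If you are fetching with cURL, the `curl_multi_*` functions let you download all feeds in parallel instead. A minimal sketch, assuming your fetch step looks roughly like this (the URLs and timeout are placeholders; adapt to your own code):

```php
<?php
// Fetch several feeds in parallel with curl_multi instead of one
// blocking curl_exec() per feed.
function fetch_feeds(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10); // don't let one slow feed stall the rest
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers at once and wait until every one has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running > 0) {
            curl_multi_select($mh); // block until there is activity
        }
    } while ($running > 0 && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

With this, the total fetch time is roughly that of the slowest feed rather than the sum of all 11, which on its own may bring the page well under the 18 seconds.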
Hope you guys have some good suggestions for me!
If you want a code example, let me know and I'll paste some.
Thanks!
I would suggest you use a cron job or a daemon that automatically synchronizes the feeds with your database by running a PHP script. That would remove the delay from the user's perspective. Run it every hour, or whatever interval suits you.
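For example, a crontab entry like the following would run the sync once an hour, on the hour (the script path and log file are just examples; add it with `crontab -e`):

```shell
# m h dom mon dow  command
0 * * * * /usr/bin/php /path/to/sync_feeds.php >> /var/log/feed_sync.log 2>&1
```

The page then only reads already-synced articles from the database, so it loads instantly no matter how slow the feeds are.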
Though first, you should probably try to figure out which parts of the process are actually slow. Without the code it's hard to tell what could be wrong.
Possible issues could be:
Here are some suggestions.
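To pin down which parts are slow, you can time each stage of the loop with `microtime()`. A sketch with a small timing helper; the fetch/parse/db closures below are dummies standing in for your own code:

```php
<?php
// Run one stage of the sync, record how long it took, and pass its
// result through.
function time_stage(callable $stage, array &$timings, string $name)
{
    $t0 = microtime(true);
    $result = $stage();
    $timings[$name] = microtime(true) - $t0;
    return $result;
}

// Dummy stages standing in for your real fetch/parse/db code:
$timings = [];
$xml   = time_stage(fn() => '<rss/>', $timings, 'fetch'); // your cURL fetch here
$items = time_stage(fn() => [$xml],   $timings, 'parse'); // your feed parsing here
time_stage(fn() => count($items),     $timings, 'db');    // your database writes here

print_r($timings); // shows seconds spent per stage
```

If `fetch` dominates, parallelizing or caching the downloads is the fix; if `db` dominates, look at batching inserts or missing indexes.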