
PHP script much slower on server than on WAMP localhost

So as I've stated above: I have a fairly big but quite simple script, which fetches a single JSON file from a website, decodes it, then saves the data in a PostgreSQL database. It takes about 4 to 5 minutes to finish completely (about 300,000 records) on my computer (i3M CPU, laptop), but about 10-15 times longer to do the same thing on the server I've just rented.

The dedicated server has an Intel quad-core Xeon (3 GHz) CPU and overall much better specifications, with much faster internet access, so I'm pretty sure it has nothing to do with that. It runs the latest Debian, Apache/2.4.10, PHP 5.6.22-0, PostgreSQL 9.5.
I've tried to copy the settings and modules from my WAMP setup, figuring it might help. Unfortunately, it didn't. I'm not sure what information might help in solving this problem, but I'm more than happy to answer any questions.
I am almost positive it has something to do with some option which I must have skipped, so any help would be much appreciated.

PS: WAMP uses Apache 2.4.17, PHP 5.6.16, PostgreSQL 9.5.

Performance issues can be caused by many things.

  • First of all, I'm fairly sure (?) your local PC has an SSD, while the server might run on default SAS drives. This can make a huge difference when it comes to reading from and writing to disk, especially for random I/O (i.e. many SELECTs with conditions and a too-small database buffer).
  • Do you have a root (dedicated) server, or a rented VM? If it's a VM, keep in mind that you share your resources with other VMs as well (especially access to the SAS disks, but also CPU time).

You should write some scripts to identify the bottleneck:

  • Write a PHP script that uses as much CPU power as possible, run a few million iterations, and compare the results. (My guess is this is not the problem.)
  • Write a PHP script that produces massive disk I/O, and compare the results over thousands of iterations.
  • Write a PHP script that uses a massive amount of memory, and compare again. (Be careful with this one: using too much memory will cause data to be swapped to disk, which skews the result.) A minimal sketch of all three checks follows this list.
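A rough sketch of what such benchmark scripts could look like (the iteration counts, file path, and array size are arbitrary assumptions; scale them to your machines and only compare the printed times between the two hosts):

```php
<?php
// Rough micro-benchmarks; absolute numbers are meaningless, only the
// relative difference between the two machines matters.

// CPU: a few million hash iterations.
$start = microtime(true);
$h = 'seed';
for ($i = 0; $i < 5000000; $i++) {
    $h = md5($h);
}
printf("CPU:    %.2fs\n", microtime(true) - $start);

// Disk I/O: thousands of small writes, flushed on every iteration.
$start = microtime(true);
$f = fopen('/tmp/io_bench.tmp', 'w');
for ($i = 0; $i < 10000; $i++) {
    fwrite($f, str_repeat('x', 4096));
    fflush($f);
}
fclose($f);
unlink('/tmp/io_bench.tmp');
printf("Disk:   %.2fs\n", microtime(true) - $start);

// Memory: build a large array, staying below memory_limit so the
// result isn't distorted by swapping.
$start = microtime(true);
$a = array();
for ($i = 0; $i < 500000; $i++) {
    $a[] = $i * 2;
}
printf("Memory: %.2fs\n", microtime(true) - $start);
```

Run the same file with the PHP CLI on both machines, so the comparison isn't skewed by web-server overhead.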

If you haven't encountered an (unexpected) difference by now, you have ruled out hardware issues. Then repeat the exercise under heavy database load to figure out whether the database might be misconfigured. Sometimes it is just a simple boolean flag that has a heavy performance impact, as illustrated below.
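As a concrete example of such flags (my choice of settings, not something named above): in PostgreSQL, synchronous_commit and fsync are boolean settings that strongly affect insert-heavy workloads. A quick sketch to dump the relevant settings on both machines via PDO (the DSN and credentials are placeholders):

```php
<?php
// Print the PostgreSQL settings that most often differ between a
// default install and a tuned one.
$pdo = new PDO('pgsql:host=localhost;dbname=mydb', 'user', 'secret');
foreach (array('synchronous_commit', 'fsync', 'shared_buffers', 'work_mem') as $name) {
    $value = $pdo->query("SHOW $name")->fetchColumn();
    echo "$name = $value\n";
}
```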

To expand on dognose's answer, I find that optimizing your DB access can make a big difference in performance.

It might be interesting to see what happens to the run time if you comment out the DB queries. This will tell you how much of the run time is spent on the DB.
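An alternative to commenting the queries out is to accumulate the time spent in the DB calls directly. A minimal sketch, where the DSN, credentials, file name, and table/column names are made-up placeholders for the actual script:

```php
<?php
// Placeholders: adjust the connection, input file, and columns to
// match the real script.
$pdo = new PDO('pgsql:host=localhost;dbname=mydb', 'user', 'secret');
$records = json_decode(file_get_contents('data.json'), true);

$dbTime = 0.0;
$stmt = $pdo->prepare('INSERT INTO records (a, b) VALUES (?, ?)');
foreach ($records as $row) {
    $t = microtime(true);
    $stmt->execute(array($row['a'], $row['b']));
    $dbTime += microtime(true) - $t;   // count only the DB call itself
}
printf("Time spent in DB calls: %.1fs\n", $dbTime);
```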

If the DB is taking significant time, try batching your requests. Instead of sending a single insert at a time to the DB, hold the rows in a variable and send 50 to 100 inserts (or more) in a batch. Depending on how you set up your DB connection, there can be significant overhead for each request.
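A sketch of that batching approach with PDO, reusing the placeholder $pdo, $records, and table/column names from the previous snippet:

```php
<?php
// Send one multi-row INSERT per 100 records instead of one statement
// per record, and wrap the whole load in a single transaction.

function insertBatch(PDO $pdo, array $batch)
{
    // One INSERT with count($batch) value tuples: (?,?),(?,?),...
    $placeholders = implode(',', array_fill(0, count($batch), '(?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO records (a, b) VALUES $placeholders");
    $params = array();
    foreach ($batch as $row) {
        $params[] = $row['a'];
        $params[] = $row['b'];
    }
    $stmt->execute($params);
}

$pdo->beginTransaction();
$batch = array();
foreach ($records as $row) {
    $batch[] = $row;
    if (count($batch) >= 100) {
        insertBatch($pdo, $batch);
        $batch = array();
    }
}
if ($batch) {
    insertBatch($pdo, $batch);   // flush the remainder
}
$pdo->commit();
```

Using a single transaction, as above, also avoids one commit (and thus one disk flush) per row, which on non-SSD disks is often the dominant cost.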
