
PHP performance issue in while loop

I use PHP to process latitude/longitude points in order to generate JS and display the points on an OSM map. I would like to start a new track on the map whenever there is a pause of 10 minutes or more in the recording.

My dataset currently has about 30,000 records spread over about 10 different tracks (some tracks have about 300 points, others have thousands).

I am running into a performance issue with PHP. When the loop aggregates a few hundred points, the data is processed quickly, but when a track has thousands of points, performance drops dramatically.

Here is the processing time for each track, normalized to 10,000 points (in seconds):

+-----------------+------------------------------+
| Points On Track | Time To Process 10000 Points |
+-----------------+------------------------------+
|              21 | 0.75                         |
|           18865 | 14.52                        |
|             539 | 0.79                         |
|             395 | 0.71                         |
|             827 | 0.79                         |
|             400 | 0.74                         |
|             674 | 0.78                         |
|            2060 | 1.01                         |
|            2056 | 0.99                         |
|             477 | 0.73                         |
|             628 | 0.77                         |
|             472 | 0.73                         |
+-----------------+------------------------------+

We can see that when a track has a lot of points, performance drops dramatically. In this particular case, processing all the points requires about 30 seconds. If I limit each track to 500 points, performance is pretty good (about 2.5 seconds to process my whole data set).

I use my Synology DS415play as the web server.

Here is my code:

$dataTab = array();
if ($result = $mysqli->query($sql)) {
    $count = 0;

    // Seed $data with the first point. (The original code concatenated
    // $dataTab[$tabPrt], an undefined index that contributed nothing.)
    $row  = $result->fetch_array();
    $data = "[" . $row['latitude'] . "," . $row['longitude'] . "],";
    $date = new DateTime($row['time']);

    while ($row = $result->fetch_array()) {
        $count++;

        $newDate = new DateTime($row['time']);
        // A gap of more than 10 minutes closes the current track.
        if (($newDate->getTimestamp() - $date->getTimestamp()) > 600) {
            array_push($dataTab, $data);
            $data  = "";
            $count = 0;
        }

        $data = $data . "[" . $row['latitude'] . "," . $row['longitude'] . "],";
        $date = $newDate;
    }
    array_push($dataTab, $data);
}

If I limit each track to 500 points, like this, performance is pretty good:

$dataTab = array();
if ($result = $mysqli->query($sql)) {
    $count = 0;

    $row  = $result->fetch_array();
    $data = "[" . $row['latitude'] . "," . $row['longitude'] . "],";
    $date = new DateTime($row['time']);

    while ($row = $result->fetch_array()) {
        $count++;

        $newDate = new DateTime($row['time']);
        // Close the track on a 10-minute gap OR after 500 points.
        if (($newDate->getTimestamp() - $date->getTimestamp()) > 600
            || $count > 500) {
            array_push($dataTab, $data);
            $data  = "";
            $count = 0;
        }

        $data = $data . "[" . $row['latitude'] . "," . $row['longitude'] . "],";
        $date = $newDate;
    }
    array_push($dataTab, $data);
}

Thanks

EDIT: I provide a sample of the data here: http://109.190.92.126/tracker/gpsData.sql
Slow script: http://109.190.92.126/tracker/map.php
Normal execution speed by splitting each track (500 pts max): http://109.190.92.126/tracker/map_split.php


If you are getting 18,000 records from the database in your worst-case scenario, you could move the timestamp check into the query to cut that down considerably. It looks like all you are doing is checking whether there's a ten-minute gap and then pushing to an array, which could be done at the MySQL level. That way you wouldn't be fetching 18,000 rows from the database every time, just the ones you need.

If you post your MySQL queries, we can take a look at putting that in there.

Edit: try changing the query to this:

SELECT time, latitude, longitude 
FROM gpsData 
WHERE  time >= '2015-09-01' 
AND provider = 'gps' 
ORDER BY time DESC
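
For what it's worth, here is a minimal sketch of how the gap check itself could also be pushed into the query, as suggested above. It assumes MySQL 8.0+ for the LAG() window function, placeholder connection credentials, and an ascending sort (the query above sorts DESC); it is an illustration of the idea, not the answerer's actual code.

<?php
// Hedged sketch: MySQL flags the start of each track, so PHP no longer
// builds a DateTime object per row. Assumes MySQL 8.0+ (LAG window
// function) and placeholder credentials; table/columns follow the query above.
$mysqli = new mysqli('localhost', 'user', 'password', 'tracker');

$sql = "SELECT time, latitude, longitude,
               (TIMESTAMPDIFF(SECOND,
                              LAG(time) OVER (ORDER BY time),
                              time) > 600) AS new_track
        FROM gpsData
        WHERE time >= '2015-09-01' AND provider = 'gps'
        ORDER BY time";

$dataTab = array();
$points  = array();
if ($result = $mysqli->query($sql)) {
    while ($row = $result->fetch_assoc()) {
        // new_track is 1 when the gap to the previous row exceeds
        // 10 minutes (and NULL, hence falsy, on the very first row).
        if ($row['new_track'] && $points) {
            $dataTab[] = implode(',', $points);
            $points = array();
        }
        $points[] = '[' . $row['latitude'] . ',' . $row['longitude'] . ']';
    }
    if ($points) {
        $dataTab[] = implode(',', $points);
    }
}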

Here's the final product (it's on Heroku, so wait until the dyno starts up): http://sove.herokuapp.com/gps/

The idea behind the solution is to calculate the diff between timestamps on the server side and to manipulate the data as arrays.
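
A rough sketch of that idea, assuming $result is a mysqli result set ordered by time (this is an illustration only, not the gist code linked below): raw timestamps are compared with strtotime() instead of constructing a DateTime per row, and each route is collected as an array of points.

// Rough sketch, not the actual gist code: $result is assumed to be a
// mysqli result set ordered by time.
$routes = array();  // one entry per route, each an array of [lat, lng] pairs
$route  = array();
$prevTs = null;
$prevPt = null;

while ($row = $result->fetch_assoc()) {
    $ts = strtotime($row['time']);  // plain integer, no DateTime object
    $pt = array((float) $row['latitude'], (float) $row['longitude']);

    if ($prevTs !== null && ($ts - $prevTs) > 600 && $route) {
        $routes[] = $route;  // gap of more than 10 minutes: close the route
        $route = array();
    }
    if ($pt !== $prevPt) {   // skip consecutive duplicate coordinates
        $route[] = $pt;
    }

    $prevTs = $ts;
    $prevPt = $pt;
}
if ($route) {
    $routes[] = $route;      // flush the final route
}
// echo json_encode($routes); // hand the whole structure to the JS map at once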

The script finishes in 0.278 s on my MBP, takes 15.75 MB of memory, and the final output, $ts, is an array of routes (I got 7 in total).

There are quite a few optimizations, including skipping points with identical coordinates. The zoom bounds are not correct on the map, but you'll figure that out. I really should have asked for a bounty for this job... If you like the outcome, let me know and I can share the codebase.

Source: https://gist.github.com/jpaljasma/04f54e0d2fa3a632071e
