
How can I optimize this PHP script?

I will run a script to check whether some trade offers have been accepted through Steam's web API. I will run it with a cron job every 15 seconds. But I want it optimized to run as fast as possible, and I feel like I have done this very poorly.

As you can see, I've added some comments that explain what the script is doing, but I'll summarize here as well.

  • It collects all the new trade offers from the database
  • It checks whether each trade offer has been cancelled or not
  • If it is not cancelled (i.e. accepted), it collects information about the offer
  • If the bot inventory contains the item that the player deposited, the database sets status = 1
  • Then it deletes the trade offer, as it has been completed

I feel like this script is running slowly. Should I change to mysqli, or maybe replace mysql_fetch_array with mysql_fetch_assoc? What can I do to optimize this? It is pretty important that it runs fast, quicker than 15 seconds.

<?php
require('xxxxxx/xx.php');

// Getting bot items
$jsonInventory = file_get_contents('https://steamcommunity.com/profiles/76561xxxxx8959977/inventory/json/730/2');
$data = json_decode($jsonInventory, true);

// Getting trade offers
$tradeoffers = mysql_query("SELECT * FROM tradeoffers");
while ($trade = mysql_fetch_array($tradeoffers)) {

    // Getting information about the trade offer
    $url = file_get_contents("https://api.steampowered.com/IEconService/GetTradeOffer/v1/?key=3593xxxxxB6FFB8594D8561374154F7&tradeofferid=" . $trade['tradeofferid'] . "&language=en_us");
    $json = json_decode($url, true);

    // Checking if the API returned the offer
    if (isset($json['response']) && isset($json['response']['offer'])) {

        $state = $json['response']['offer']['trade_offer_state'];

        // Offer is invalid, cancelled, declined, expired, etc.
        if (in_array($state, array(1, 5, 6, 7, 8, 10, 11))) {
            mysql_query("DELETE FROM tradeoffers WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
            mysql_query("DELETE FROM items WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
        }

        // Offer was accepted
        if ($state == 3) {
            if (isset($data['rgDescriptions'])) {
                $itemsinfo = mysql_query("SELECT * FROM items WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
                while ($item = mysql_fetch_array($itemsinfo)) {
                    foreach ($data['rgInventory'] as $inv) {
                        $desc = $data['rgDescriptions'][$inv['classid'] . '_' . $inv['instanceid']];
                        if ($desc['icon_url'] == $item['iconurl']) {
                            mysql_query("UPDATE items SET assetid = '" . $inv['id'] . "' WHERE iconurl = '" . $item['iconurl'] . "'");
                            mysql_query("UPDATE items SET status = 1 WHERE iconurl = '" . $item['iconurl'] . "'");
                        }
                    }
                }
            }
            // Deleting the trade offer from the database, as it has been completed
            mysql_query("DELETE FROM tradeoffers WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
        }
    } else {
        mysql_query("DELETE FROM tradeoffers WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
        mysql_query("DELETE FROM items WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
    }
}
echo 'Finished';
?>

First, I'd advise you to move away from the mysql_* functions and use either PDO or mysqli.
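To make this concrete, here's a minimal sketch of the same kind of loop using PDO with prepared statements. It runs against an in-memory SQLite database so it is self-contained; in your case the DSN would be something like `mysql:host=localhost;dbname=yourdb`, and the schema here is a hypothetical cut-down version of your tradeoffers table.

```php
<?php
// Self-contained sketch using in-memory SQLite so it runs anywhere;
// for MySQL the DSN would be e.g. new PDO('mysql:host=localhost;dbname=yourdb', $user, $pass).
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Hypothetical minimal schema covering only the column the loop touches.
$pdo->exec("CREATE TABLE tradeoffers (tradeofferid TEXT PRIMARY KEY)");
$pdo->exec("INSERT INTO tradeoffers (tradeofferid) VALUES ('1001'), ('1002')");

// Prepared once, executed many times; parameters are bound safely (no SQL injection).
$delete = $pdo->prepare("DELETE FROM tradeoffers WHERE tradeofferid = :id");

$offers = $pdo->query("SELECT tradeofferid FROM tradeoffers")->fetchAll(PDO::FETCH_ASSOC);
foreach ($offers as $trade) {
    // ...check the offer against the Steam API here...
    $delete->execute(array(':id' => $trade['tradeofferid']));
}

$count = (int)$pdo->query("SELECT COUNT(*) FROM tradeoffers")->fetchColumn();
echo $count; // 0
```

Besides safety, preparing the DELETE once outside the loop avoids re-parsing the same SQL on every iteration.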

Optimization. I've not run your code, but some pointers:

"SELECT * FROM" might be slow. Try to select only the fields you need.

You are updating on WHERE iconurl = '".$item['iconurl']."'. Is this field indexed?

Is it necessary to DELETE these records? That is a slow operation. What happens if you flag them instead, e.g. complete = 1? (You may still delete them in one go later if your table gets too crowded.)
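A sketch of that flag-then-purge idea, again against in-memory SQLite so it is runnable as-is; the column name `complete` follows the example above and the schema is assumed:

```php
<?php
// Sketch: flag completed offers in the hot loop, purge them in bulk later.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE tradeoffers (tradeofferid TEXT, complete INTEGER DEFAULT 0)");
$pdo->exec("INSERT INTO tradeoffers (tradeofferid) VALUES ('1'), ('2'), ('3')");

// In the 15-second loop: a cheap flag update instead of a DELETE.
$flag = $pdo->prepare("UPDATE tradeoffers SET complete = 1 WHERE tradeofferid = :id");
$flag->execute(array(':id' => '2'));

// Later (e.g. nightly): purge all flagged rows in a single statement.
$purged = $pdo->exec("DELETE FROM tradeoffers WHERE complete = 1");
echo $purged; // 1
```

Queries that read pending offers would then add `WHERE complete = 0`.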

One lever for increasing performance is to switch from file_get_contents to curl for getting data from the API. curl is usually much faster. Also, with curl you can run multiple requests in parallel, which brings another performance boost (if you are able to parallelize your requests).
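As a starting point, a drop-in replacement for file_get_contents() using curl might look like this sketch; the helper name and the timeout values are my own choices, not anything from the original script:

```php
<?php
// Sketch: fetch a URL with curl instead of file_get_contents(), with explicit
// timeouts so one slow Steam API call can't stall the whole 15-second run.
function http_get($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 3);    // give up connecting after 3 seconds
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // give up on the whole transfer after 10 seconds
    $body = curl_exec($ch);
    curl_close($ch);
    return $body === false ? null : $body;          // null on any transport error
}

// Usage, same shape as before:
// $json = json_decode(http_get("https://api.steampowered.com/..."), true);
```

Unlike file_get_contents, a failed fetch here returns null instead of emitting a warning, so the caller can skip that offer and continue.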

See also this question.

Another lever is to parallelize your database calls, which you could do after migrating to mysqli. See this question for details (again, assuming it's possible and makes sense logic-wise).

There have been some good answers here; I'll start by seconding what they've said in a brief summary and then add my two cents.

(1) Your biggest performance gain will come from Erik's two suggestions regarding cURL. Switching to cURL will provide a small increase in performance (maybe 0.5 to 1 second or more per call), but using multi-curl to issue the requests in parallel will likely provide THE ABSOLUTE BIGGEST BENEFIT of all the suggestions here, no questions asked (since you're doing these network fetches in a loop). Here's a class that someone else wrote that simplifies multi-curl a bit:

<?php
// LICENSE: PUBLIC DOMAIN
// The author disclaims copyright to this source code.
// AUTHOR: Shailesh N. Humbad
// SOURCE: https://www.somacon.com/p539.php
// DATE: 6/4/2008

// index.php
// Run the parallel get and print the total time
$s = microtime(true);
// Define the URLs
$urls = array(
  "http://localhost/r.php?echo=request1",
  "http://localhost/r.php?echo=request2",
  "http://localhost/r.php?echo=request3"
);
$pg = new ParallelGet($urls);
print "<br />total time: ".round(microtime(true) - $s, 4)." seconds";

// Class to run parallel GET requests and return the transfer
class ParallelGet
{
  function __construct($urls)
  {
    // Create get requests for each URL
    $mh = curl_multi_init();
    foreach($urls as $i => $url)
    {
      $ch[$i] = curl_init($url);
      curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, 1);
      curl_multi_add_handle($mh, $ch[$i]);
    }

    // Start performing the request
    do {
        $execReturnValue = curl_multi_exec($mh, $runningHandles);
    } while ($execReturnValue == CURLM_CALL_MULTI_PERFORM);
    // Loop and continue processing the request
    while ($runningHandles && $execReturnValue == CURLM_OK) {
      // Wait forever for network
      $numberReady = curl_multi_select($mh);
      if ($numberReady != -1) {
        // Pull in any new data, or at least handle timeouts
        do {
          $execReturnValue = curl_multi_exec($mh, $runningHandles);
        } while ($execReturnValue == CURLM_CALL_MULTI_PERFORM);
      }
    }

    // Check for any errors
    if ($execReturnValue != CURLM_OK) {
      trigger_error("Curl multi read error $execReturnValue\n", E_USER_WARNING);
    }

    // Extract the content
    foreach($urls as $i => $url)
    {
      // Check for errors
      $curlError = curl_error($ch[$i]);
      if($curlError == "") {
        $res[$i] = curl_multi_getcontent($ch[$i]);
      } else {
        print "Curl error on handle $i: $curlError\n";
      }
      // Remove and close the handle
      curl_multi_remove_handle($mh, $ch[$i]);
      curl_close($ch[$i]);
    }
    // Clean up the curl_multi handle
    curl_multi_close($mh);

    // Print the response data
    print_r($res);
  }

}

The catch here is that this approach depends heavily on how many trade offers you have at any given time, since you're doing a network call for each one. If you have 1,000 trade offers, you might have to break them up into smaller chunks so you're not slamming the Steam API with a ton of calls all at the same time.
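The chunking itself is a one-liner with array_chunk; here's a sketch, where the batch size of 50 is an arbitrary assumption you'd tune against the API's rate limits:

```php
<?php
// Sketch: split offer IDs into bounded batches so each multi-curl round
// only issues a limited number of parallel requests. Batch size is assumed.
$offerIds = range(1, 1000);            // stand-in for tradeofferid values from the DB
$batches  = array_chunk($offerIds, 50);

foreach ($batches as $batch) {
    $urls = array();
    foreach ($batch as $id) {
        $urls[] = "https://api.steampowered.com/IEconService/GetTradeOffer/v1/"
                . "?key=YOUR_KEY&tradeofferid=" . $id . "&language=en_us";
    }
    // new ParallelGet($urls); // one bounded round of parallel requests per batch
}

echo count($batches); // 20
```

With 1,000 offers and a batch size of 50 you get 20 rounds of 50 parallel requests instead of 1,000 sequential ones.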

(2) If you're running it every 15 seconds, then you're likely incurring some overhead from the script starting up. You could run this script in an infinite loop to eliminate that startup time, although you'd have to ensure there are no memory leaks so your script doesn't eventually run out of memory:

<?php
set_time_limit(0);
while(true)
{
  ...your code here...

  // Wait 15 seconds before the next round
  sleep(15);
}

(3) I'm assuming your database is pretty small, but if you've got 10k records or more in any given table, then indexes ARE going to be important, as Herco mentioned. Without good indexes, your SQL queries are going to suffer.
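For the columns this script filters on (iconurl and tradeofferid in the items table), the indexes might look like this sketch. It runs against in-memory SQLite so it's self-contained; the index names are my own, and on MySQL you'd write `ALTER TABLE items ADD INDEX idx_items_iconurl (iconurl);` instead:

```php
<?php
// Sketch: indexes matching the WHERE clauses the script actually uses.
// Demonstrated on in-memory SQLite; the schema is an assumed minimal version.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec("CREATE TABLE items (tradeofferid TEXT, iconurl TEXT, assetid TEXT, status INTEGER)");

// One index per filtered column, so neither the UPDATEs nor the DELETEs full-scan.
$pdo->exec("CREATE INDEX idx_items_iconurl ON items (iconurl)");
$pdo->exec("CREATE INDEX idx_items_tradeofferid ON items (tradeofferid)");

// With the index in place, this lookup no longer scans the whole table.
$stmt = $pdo->prepare("UPDATE items SET status = 1 WHERE iconurl = :icon");
$stmt->execute(array(':icon' => 'some-icon-hash'));
```

Note that indexes speed up the WHERE lookups but add a small cost to every INSERT/UPDATE, which is usually a good trade at this table size.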

However, for your best improvements I would focus less on #3 and more on #1 and #2.
