
Echoing content sometimes takes a very long time

I have a script that builds my web page in one string ($content) and then echoes it to the user.

My script looks like this:

$time1 = microtime(true);
$content = create_content();
$content_time = (microtime(true) - $time1);

$time = microtime(true);
echo $content;
$echo_time = (microtime(true)-$time);

Now $content_time is always well under 0.5s, so that's no problem. However, a few times a day $echo_time is well above one second and can even go up to 15 seconds. The content isn't really big, about 10-20kb, and the times at which this happens are completely random, so it's not at busy times and it even happens in the middle of the night.

Does anybody have any idea what this could be?

EDIT The site is hosted on a (remote) dedicated server that only hosts this site. There is a database involved, but as I said, $content_time is well under 1 second, so what that function does cannot be the cause of the delay.

When the total time of my page goes above a certain value (let's say 5s) I log it. Even Googlebot seems to have these issues sometimes, so I don't think it is using a dial-up connection :)
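For reference, a minimal sketch of that kind of threshold check, building on the timing variables from the script above (the actual logging code and destination are different; error_log is just illustrative):

// simplified sketch of the threshold logging described above
$total_time = $content_time + $echo_time;
if ($total_time > 5) {
    error_log(sprintf("Slow request: content=%.3fs echo=%.3fs", $content_time, $echo_time));
}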

Let's narrow the issue down and factor out some things...

In the question you indicate you're echoing out 10-15kb. That's a significant amount no matter how it's buffered to output. Remember PHP is single-threaded: once you flush your buffer, you have to wait for all of that output to go out via the shell or HTTP before the script continues, and it will eventually have to flush the internal buffer before finishing the echo.

To get a clean timing of the echo without that flushing overhead, try replacing this:

$time = microtime(true);
echo $content;
$echo_time = (microtime(true)-$time);

with this:

ob_start();                             // buffer output instead of sending it
$time = microtime(true);
echo $content;                          // echo now only writes to the internal buffer
$echo_time = (microtime(true)-$time);
ob_clean();                             // discard the buffered output

This will echo into a buffer, but not actually send anything out via HTTP or anything else. That should give you the 'real' time of the echo command without any of the cost of sending what's in the buffer.

If echo_time shrinks down, you have a transport issue, which you can address as best you can with buffering.

If echo_time is still too large, you'll need to start digging into the PHP C code.

Either way, you're a lot closer to finding your issue and a solution.

From http://wonko.com/post/seeing_poor_performance_using_phps_echo_statement_heres_why

This old bug report may shed some light. In short, using echo to send large strings to the browser results in horrid performance due to the way Nagle's Algorithm causes data to be buffered for transmission over TCP/IP.

The solution? A simple three-line function that splits large strings into smaller chunks before echoing them:

function echobig($string, $bufferSize = 8192) {
    $splitString = str_split($string, $bufferSize);

    foreach ($splitString as $chunk) {
        echo $chunk;
    }
}
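In the question's script, the call would simply replace the plain echo, e.g. (illustrative usage):

// instead of: echo $content;
echobig($content);           // default 8192-byte chunks
// echobig($content, 4096);  // or experiment with a different chunk size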

Play around with the buffer size and see what works best for you. I found that 8192, apart from being a nice round number, seemed to be a good size. Certain other values work too, but I wasn't able to discern a pattern after several minutes of tinkering, and there's obviously some math at work that I have no desire to try to figure out.

By the way, the performance hit also happens when using PHP's output control functions (ob_start() and friends).

Following the OP's comment that he had tried this, I also found the following on PHP.net suggesting that str_split() can be a waste of resources and that the echobig() function can be optimised further with the following code:

function echobig($string, $bufferSize = 8192) {
  // suggest doing a test for integer & positive bufferSize
  for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
    echo substr($string, $start, $bufferSize);
  }
}
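If you want the integer/positive check the comment suggests, a minimal sketch could look like this (the guard itself is illustrative, not part of the original snippet):

function echobig($string, $bufferSize = 8192) {
    // fall back to the default chunk size if the caller passes
    // a non-integer or non-positive value
    if (!is_int($bufferSize) || $bufferSize <= 0) {
        $bufferSize = 8192;
    }
    for ($chars = strlen($string) - 1, $start = 0; $start <= $chars; $start += $bufferSize) {
        echo substr($string, $start, $bufferSize);
    }
}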

Have you tried running your script using the CLI rather than through Apache?

You may be able to do this better using output buffers. On a basic level, you use ob_start() to begin writing to an output buffer, and then ob_end_flush() to push it to the client. Here is what php.net has to say about ob_start():

This function will turn output buffering on. While output buffering is active no output is sent from the script (other than headers); instead the output is stored in an internal buffer. The contents of this internal buffer may be copied into a string variable using ob_get_contents(). To output what is stored in the internal buffer, use ob_end_flush().
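A minimal sketch of that flow, assuming the page is built the same way as in the question:

ob_start();                  // turn output buffering on; nothing is sent yet
echo create_content();       // output goes into the internal buffer instead
$page = ob_get_contents();   // optionally copy the buffered output into a string
ob_end_flush();              // send the buffered output to the client and stop buffering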

I had the same problem in the past, very similar to yours. I found that this problem can be caused by slow clients. If the client fetches half of the page and then hangs, PHP will wait until the client is ready and then send the rest of the content. So it may not be a problem on your side at all.

Update:

You can try the following scripts on your server to check this. Put this script on your server and call it echo.php:

<?php
$time_start = time();
echo str_repeat("a", 200000);
echo "\nThis script took: " . (time() - $time_start) . " sec";

Then fetch it with this script (change example.com to your domain):

<?php
$fp = fsockopen("example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    $out = "GET /echo.php HTTP/1.1\r\n";
    $out .= "Host: example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 5000);
        sleep(1); // simulate a slow client by reading one small chunk per second
    }
    fclose($fp);
}

With the sleep in place, echo.php ran for 27 seconds. When I removed the sleep(1) line, echo.php took only 2 seconds to run.

As it is not possible to tell you the reason without knowing the body of your create_content() function, I suggest you add more "time logging" calls directly inside that function. By narrowing down the instrumented code further and further you will finally find the line that is causing the lag. Knowing the specific line will help you understand the problem (database, machine load, connection problems to external services, ...).
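For example, something along these lines inside create_content(); the checkpoint names and the two steps are placeholders for whatever the function really does:

function create_content() {
    $checkpoints = array();

    $t = microtime(true);
    $data = fetch_data_from_database();            // placeholder for your real query code
    $checkpoints['db'] = microtime(true) - $t;

    $t = microtime(true);
    $html = render_template($data);                // placeholder for your real rendering code
    $checkpoints['render'] = microtime(true) - $t;

    error_log(print_r($checkpoints, true));        // or write to your own log file
    return $html;
}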

Do you have any while() or for() loops in your script? If so, you should check that their conditions aren't conflicting with anything. I occasionally forget about these myself and my script ends up running for about 30 seconds as well.

My guess is that the act of accessing that large a string takes up a decent amount of memory over multiple uses. Because PHP is garbage-collected, memory is used up until the garbage collector runs, and only then is it freed. My guess is that multiple requests storing content in a string variable cause RAM to fill up quickly. A few times a day you then hit the limit, causing the slower load times. The garbage collector kicks in, and everything is back to normal.
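If you want to check that theory, you could log memory alongside the slow echoes (a sketch; memory_get_usage() and memory_get_peak_usage() are standard PHP functions, the threshold and format are just examples):

$time = microtime(true);
echo $content;
$echo_time = microtime(true) - $time;

if ($echo_time > 5) {
    // log memory use together with the slow echo to see whether they correlate
    error_log(sprintf(
        "slow echo: %.2fs, mem: %d bytes, peak: %d bytes",
        $echo_time,
        memory_get_usage(true),
        memory_get_peak_usage(true)
    ));
}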

If this is a dedicated server, log in to the console and see which process uses a lot of CPU time when you generate the content. It is very hard to tell without seeing the code. Maybe you just need some indexes in the database, or maybe you should remove some.

You can also check the httpd and mysqld log files.
