
How to debug "Symfony\Component\Debug\Exception\FatalErrorException" errors in PHP (Laravel)?

I am getting reports of many errors encountered by clients

Symfony\Component\Debug\Exception\FatalErrorException

Maximum execution time of 30 seconds exceeded

I myself cannot replicate it on my local machine or on the production server. The URLs involved are spread all throughout the site, so I guess the cause is something global, like a middleware.

I am using Sentry.io to collect the data, but the exception trace only has one entry, pointing at code inside the Symfony base code, most commonly:

vendor/symfony/finder/Iterator/ExcludeDirectoryFilterIterator.php at line 73

vendor/symfony/finder/Iterator/DateRangeFilterIterator.php at line 45

vendor/symfony/finder/Iterator/RecursiveDirectoryIterator.php at line 69

Clearly it seems to be something related to the file system, but with no trace I cannot see where to look for the mistake in the site code. I would guess it is some kind of infinite loop or leak, but there is no trace to examine and no consistent way to reproduce the problem.

How should I be looking for the problem and debugging this?

Are there any settings I could set, or tools I could use/enable?

After reading your chat conversation, I saw that you're using this .env configuration:

CACHE_DRIVER=file 
SESSION_DRIVER=file 

I think this is the problem... let me explain a little better.

When you use the file driver for the cache or the session, Laravel creates tons of files that store user session data or application cache data.

If your e-commerce site is growing and generating a lot of traffic, it may be that performance is degrading because of all these files the framework has to scan.
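A quick sanity check (a sketch, assuming the default Laravel storage layout; the paths are assumptions, so adjust them to your app) is to count how many files those drivers have accumulated:

```php
<?php
// Count the files the `file` cache/session drivers have piled up.
// Paths assume a default Laravel project layout.
$dirs = [
    'storage/framework/sessions',
    'storage/framework/cache',
];
foreach ($dirs as $dir) {
    if (is_dir($dir)) {
        // scandir() includes '.' and '..', so subtract 2.
        $count = count(scandir($dir)) - 2;
        echo "$dir: $count files\n";
    }
}
```

If these directories hold tens of thousands of entries, every pass the framework makes over them gets slower.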

I think there are two possible conclusions:

  • Your production environment has to be upgraded (I don't know your production server's specs or whether you have enough resources).
  • The file driver is becoming too slow for your application's requirements.

I usually use Redis as the cache and session driver; it's faster, and with a good "smart caching" strategy it's a great tool.

I think you should try it as well, if possible. Memcached may be a good solution too.
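For reference, switching both drivers to Redis is mostly an .env change (a sketch assuming a Redis server on localhost with the default port, and a Redis client such as predis or phpredis already installed; the host and port are assumptions for your environment):

```
CACHE_DRIVER=redis
SESSION_DRIVER=redis
REDIS_HOST=127.0.0.1
REDIS_PORT=6379
```

Remember to clear the cached configuration (php artisan config:clear) after changing these values.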

If you are not sure about the cause of the exception, you can handle it in two ways:

1. Increase the request timeout: ini_set('max_execution_time', 60); // 60 seconds = 1 minute

2. Wrap your code in a try/catch:

try {
    // logic goes here
} catch (\Exception $e) {
    Log::error($e->getMessage() . ' ' . $e->getFile() . ' ' . $e->getLine());
    return back()->with('error', $e->getMessage());
}

Can you register a shutdown function? The shutdown function is called even when a timeout occurs, and in it you can print or save whatever you want to a log file. I'm not sure whether there is a better way to get the backtrace in Laravel, but this is how I'd probably do it in plain PHP (calling debug_backtrace):

<?php

function timedOut() {
    // save to a log file instead of printing
    var_dump(debug_backtrace());
}

register_shutdown_function("timedOut");

http://php.net/manual/en/function.register-shutdown-function.php

http://php.net/manual/en/function.debug-backtrace.php

I am not very familiar with Laravel, but I have had this issue and solved it by using PHP's register_shutdown_function.

I have found it very useful for tracking errors that occur randomly. This is how I do it in my code. You could put this somewhere in a common file that executes on every page; index.php would be a good option for you, as all Laravel routes go through it (my assumption).

register_shutdown_function( "check_for_fatal" );

function check_for_fatal(){
    $time = time(); // time when this error occurred

    $error = error_get_last(); // returns null if no error has occurred
    if ($error !== null && in_array($error["type"], [E_ERROR, E_CORE_ERROR, E_RECOVERABLE_ERROR])){
        $email_body = [];
        $email_body[] = 'Date: ' . date('m-d-Y H:i:s', $time);
        ob_start();
        var_dump($error);
        $email_body[] = ob_get_clean();
        // include any other data as needed
        // $email_body[] = "add data as appropriate";

        // You can email it to yourself, but if there are lots of errors you will be bombarded with emails
        mail('your_email_address@example.com', 'Subject', implode("\r\n", $email_body));
        // or you can save this to some log file
    }
}

It looks like PHP is waiting for some resource, e.g. file access, the database, or a mail server (I suspect a file).

  • Did you try using your app in many tabs with one session?
  • Did you try logging in to the same account from many machines?
  • Maybe some part of the script is opening a file and not closing it?
  • Did you track a user's actions from opening the site to getting this error?
  • Check your production database: maybe it has a very small connection limit?

EDIT

I see that you are using the dannyvankooten/vat.php library, which makes requests to external services. This can be a source of your problems. The library makes its requests using cURL. The author sets CURLOPT_CONNECTTIMEOUT, but CURLOPT_TIMEOUT is not set, so your script can sometimes wait longer than the max_execution_time setting allows.
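As a mitigation sketch (this is not vat.php's actual code; the URL and the limits here are placeholder assumptions), a hard total timeout on a cURL request looks like this:

```php
<?php
// Bound both the connect phase and the whole transfer, so a stalled
// external service fails fast instead of consuming the 30-second
// max_execution_time budget.
$ch = curl_init('https://vat-service.example/check'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // max seconds to connect
curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // max seconds end-to-end
$response = curl_exec($ch);
if ($response === false) {
    // Log and degrade gracefully rather than blocking the whole request.
    error_log('Remote call failed: ' . curl_error($ch));
}
curl_close($ch);
```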

I don't think you should increase the timeout just yet. If you use the try/catch approach you might get an understanding of the underlying problem. If not, check the operations/functions performed in the specific method or class. There is a chance you are querying a huge table and then trying to use all the results.

If you do

\DB::listen(function ($query) {
    // In Laravel 5.2+ the listener receives an Illuminate\Database\Events\QueryExecuted object
    var_dump($query->sql, $query->bindings, $query->time);
});

this will give you an indication of how many queries are running for the operation.

There is no way to catch this in a try/catch, as it's actually a PHP error rather than an exception.

To be able to debug this, you have a couple of options:

  1. Add some logs in the code to identify where it times out
  2. You can use the Laravel dump server package (beyondcode/laravel-dump-server) to dump out logs. This is actually going to ship with Laravel 5.7, but you can always add the package yourself for now

I installed laravel-debugbar; I think it will help you.

composer require barryvdh/laravel-debugbar

Next, open config/app.php and inside the 'providers' array add:

Barryvdh\Debugbar\ServiceProvider::class,

In the 'aliases' array add:

'Debugbar' => Barryvdh\Debugbar\Facade::class,

Then you can measure an operation:

Debugbar::measure('My long operation', function () {
    // Do something…
});

You cannot catch a PHP timeout error; it occurs when the PHP interpreter stops execution. You can only increase the time limit, e.g. ini_set('max_execution_time', 300), or move the long-running work into a background job, e.g. a Laravel scheduled task.

A newer Sentry version will give you a proper stack trace.

You must use getsentry/sentry-php version >= 2.0.

In all honesty, your best call is to install Xdebug and debug it the old-school way: step through the whole request to find the bottleneck and try to figure out where it comes from. Laravel, like other frameworks, was designed to run as smoothly as possible; if you're facing errors of this kind, something may simply have been coded incorrectly.

It's also difficult to give specific advice without more information. What I would propose is to recreate the specs of the environment your Laravel app runs in (you can use Docker, Vagrant, or whatever comes to mind that works) and then run it with Xdebug to find where the problem lies.
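For completeness, a minimal step-debugging setup (a sketch assuming Xdebug 3 and an IDE listening on the default port; the host, port, and trigger mode are assumptions for your environment) looks like:

```
; php.ini
zend_extension=xdebug
xdebug.mode=debug
xdebug.start_with_request=trigger
xdebug.client_host=127.0.0.1
xdebug.client_port=9003
```

With start_with_request=trigger, a request only starts a debug session when it carries the XDEBUG_TRIGGER cookie or query parameter, so normal traffic is unaffected.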
