Bash: grep PHP error_log fatal errors to a file by filter


I have a webserver with several PHP web applications running. The configuration is set up so that the php error_log is deleted at midnight (so the file lasts for 24 hours only).

The thing is, I would like to log all fatal errors to another file or a database, but only for specific web applications (about 20 are running, 4 of them are mine).

I was thinking about creating a bash script that greps the error_log for "Fatal" and the URLs of my applications, appends the output to a file, and remembers the last processed line number of the current error_log in a separate cache file.

I would then put the script in cron and execute it every few minutes, starting each run at the last line of the previous one (see the sketch below).
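
For what it's worth, here is a minimal bash sketch of that approach. The log path, output path, state file, and application patterns are all placeholder assumptions; adjust them to your setup:

    #!/usr/bin/env bash
    # Incrementally collect fatal errors for selected apps from php error_log.
    # All paths and the APPS pattern are assumptions; adjust to your setup.

    LOG=/var/log/php/error_log          # the error_log that gets wiped nightly
    OUT=/var/log/php/my_fatals.log      # where matching lines accumulate
    STATE=/var/tmp/php_fatal.offset     # line count reached by the previous run
    APPS='app1\.example\.com|app2\.example\.com'

    last=$(cat "$STATE" 2>/dev/null || echo 0)
    total=$(wc -l < "$LOG")

    # If the log shrank (deleted at midnight), start again from the top.
    [ "$total" -lt "$last" ] && last=0

    # Only look at the lines added since the last run. Lines appended between
    # the wc above and this tail may be seen again on the next run (harmless
    # duplicates are preferable to missed errors).
    tail -n +"$((last + 1))" "$LOG" \
        | grep 'PHP Fatal error' \
        | grep -E "$APPS" >> "$OUT"

    echo "$total" > "$STATE"

Scheduled from cron, e.g. every five minutes:

    */5 * * * * /usr/local/bin/collect_fatals.sh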

The whole idea feels a little messy and I think it could be done more efficiently. Any ideas?

Writing a cron job seems OK if you can't configure this out of the box. I don't know PHP well enough to say; in Java, for example, you can have the same log message go to several log files depending on criteria.

But I would have your cron job do both the collecting of the fatal errors AND the deletion of the "last day's" log file. That way a single run of the script at midnight suffices, and you save yourself the complexity of tracking where you ended last time (and the chance of missing errors that happen right before midnight). If the collection was OK, delete the old file; otherwise leave it for diagnosis and retry. It also saves you a bunch (24*60) of calls to the script.
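
A sketch of that nightly variant, with the same placeholder paths and patterns as above; it only deletes the log when the collection step could actually read it:

    #!/usr/bin/env bash
    # Nightly job: collect the day's fatal errors for my apps, then drop the log.
    # Paths and the APPS pattern are assumptions; adjust to your setup.

    LOG=/var/log/php/error_log
    OUT=/var/log/php/my_fatals.log
    APPS='app1\.example\.com|app2\.example\.com'

    # If the log can't be read, leave it in place for diagnosis and retry.
    if [ ! -r "$LOG" ]; then
        echo "$(date): cannot read $LOG, keeping it" >&2
        exit 1
    fi

    grep 'PHP Fatal error' "$LOG" | grep -E "$APPS" >> "$OUT"

    # Delete yesterday's log; use ': > "$LOG"' to truncate in place instead
    # if whatever writes the log keeps the file handle open.
    rm -f "$LOG"

Run once, at midnight:

    0 0 * * * /usr/local/bin/nightly_fatals.sh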
