
Managing large log files

We have a crawler that persistently crawls our target sites, and its log files are getting quite huge — over 1 GB in some cases. I'm not comfortable with deleting or overwriting them. Any examples of how you have managed large log files?

Use a cron script to rotate the log files daily. Basically, you rename logfile.log to logfile-YYYY-MM-DD.log. This way, instead of one huge log file, you have smaller ones and can easily find log messages from a specific time period. If you also compress the rotated logs, you save even more disk space.
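Here is a minimal sketch of such a rotation script in Python, scheduled via cron (the file name logfile.log is an assumption — adjust it to your crawler's actual log path). Note that if the crawler keeps the file handle open, it will need to reopen the log after rotation (or you would copy-then-truncate instead of renaming); many Unix systems also ship logrotate, which handles all of this out of the box.

import gzip
import shutil
from datetime import date
from pathlib import Path

LOG_FILE = Path("logfile.log")  # hypothetical path; point this at your crawler's log

def rotate_log():
    """Rename today's log to a dated file, then gzip it to save disk space."""
    if not LOG_FILE.exists():
        return
    dated = LOG_FILE.with_name(f"logfile-{date.today():%Y-%m-%d}.log")
    LOG_FILE.rename(dated)  # logfile.log -> logfile-YYYY-MM-DD.log
    with open(dated, "rb") as src, gzip.open(f"{dated}.gz", "wb") as dst:
        shutil.copyfileobj(src, dst)  # compress the rotated file
    dated.unlink()  # drop the uncompressed copy

if __name__ == "__main__":
    rotate_log()

A crontab entry such as `0 0 * * * /usr/bin/python3 /path/to/rotate_log.py` would run this at midnight each day.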

You can also use log4net, log4j, or another logging framework that automatically rolls files over based on size or date: http://logging.apache.org/log4net/
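As an illustration of the same idea (not log4net/log4j itself), Python's standard logging module offers equivalent size-based and time-based rotation handlers; the file name and limits below are assumptions:

import logging
from logging.handlers import RotatingFileHandler

# Roll over when the log reaches ~50 MB, keeping the 10 most recent files.
handler = RotatingFileHandler("crawler.log", maxBytes=50 * 1024 * 1024, backupCount=10)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

logger = logging.getLogger("crawler")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("crawled page")  # writes to crawler.log, rotating automatically

TimedRotatingFileHandler works the same way but rotates on a schedule (e.g. when="midnight") rather than on file size.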

