
How to analyze execution times from the log?

I'm using log4j with the %d ... conversion pattern, which makes every log message begin with a timestamp like so: 2011-06-26 14:34:16,357. I log each SQL query I submit.

I would like to analyze the deltas between SQL queries, and maybe even aggregate multiple executions of the exact same SQL query into max time and average time.

How would you approach this? Using grep and some Excel work? Is there some common way/tool/script that would make my life easy?

PS: to make things more annoying, my SQL statements span multiple lines, so the log4jdbc sqltiming logger prints them like so:

2011-06-26 14:43:32,112 [SelectCampaignTask ] INFO : jdbc.sqltiming - CREATE INDEX idx ON tab CRLF
USING btree (id1, id2, emf);  {executed in 34788 msec}
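
One way to stitch those multi-line entries back together before parsing is to treat the leading timestamp as the entry boundary. Below is a minimal Python sketch of that idea; the file name jdbc.log and the timestamp regex are assumptions for illustration, not part of the actual setup.

import re

# Assumption: every new log entry starts with "YYYY-MM-DD HH:MM:SS,mmm"
TIMESTAMP = re.compile(r'^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}')

def read_entries(path):
    """Yield complete log entries, joining continuation lines (the multi-line SQL) into one string."""
    entry = []
    with open(path) as f:
        for line in f:
            if TIMESTAMP.match(line) and entry:
                yield ''.join(entry)
                entry = []
            entry.append(line)
    if entry:
        yield ''.join(entry)

# hypothetical usage: print only the sqltiming entries
for e in read_entries('jdbc.log'):
    if '{executed in' in e:
        print(e.strip())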

I just solved the same issue by writing a small script in Python. I am a total newbie to Python and I was able to get it working in less than a couple of hours.

Here are the key parts of my code:

import re

# Read the whole log file
with open("jdbcPerf.log", "r") as f:
    logfile = f.readlines()

# Extract the interesting lines: those starting with a date and
# those containing the "{executed in ... msec}" marker
selectedLines = []
for line in logfile:
    m = re.search(r'^(\d+-\d+-\d+)|(\{executed )', line)
    if m:
        selectedLines.append(line)

# Extract the servlet name and the execution time
for line in selectedLines:
    # extract servlet name, e.g. "servlets.BrowseCategories.categoryList"
    m = re.search(r'servlets\.([a-zA-Z]*)\.([a-zA-Z]*)', line)
    if m:
        print(m.group())
    # extract execution time in msec
    m = re.search(r'executed in (\d+) msec', line)
    if m:
        print(m.group(1))

You can use this as a skeleton to then do whatever data aggregation you need.

My log file looks like this:

2013-05-26 08:22:10,583 DEBUG [jdbc.sqltiming]   
com.myclass.servlets.BrowseCategories.categoryList(null:-1)
16. select category0_.id as id, category0_.name as name from categories category0_
{executed in 7 msec}
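
To get the max time and average time per statement asked about in the question, you can group the extracted (sql, msec) pairs with a dictionary. A minimal sketch, assuming you have already pulled those pairs out of the log with parsing code like the above (the sample data below is made up):

from collections import defaultdict

def aggregate(pairs):
    """pairs: iterable of (sql_text, msec) tuples extracted from the log."""
    times = defaultdict(list)
    for sql, msec in pairs:
        times[sql].append(msec)
    # report slowest statements first
    for sql, ts in sorted(times.items(), key=lambda kv: -max(kv[1])):
        print("calls=%d  max=%d ms  avg=%.1f ms  %s"
              % (len(ts), max(ts), sum(ts) / float(len(ts)), sql[:60]))

# example usage with hypothetical data
aggregate([("select ... from categories", 7),
           ("select ... from categories", 12),
           ("CREATE INDEX idx ON tab USING btree (id1, id2, emf)", 34788)])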

LogMX is a log viewer tool that can export any log file to CSV, while parsing the date and handling multi-line log events. You can also (in its GUI) compute the time elapsed between several log events.

To do so, you first need to describe (in LogMX) your log format using a Log4j pattern or a regular expression.

PS: you can also export log files from the command line using this tool (a console mode is provided).

I would be tempted to write a Groovy/Perl/Python script to pick apart the logs using a regular expression.

If you dump the output to CSV you can certainly use Excel to data mine.
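
For instance, here is a minimal sketch of dumping parsed rows to a CSV file with Python's standard csv module so Excel can pivot them; the rows list below is a made-up placeholder for whatever your parsing script produces:

import csv

# rows is assumed to come from your log-parsing script:
# a list of (timestamp, sql_text, msec) tuples
rows = [("2013-05-26 08:22:10,583", "select ... from categories", 7)]

with open("sql_times.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "sql", "msec"])  # header row for Excel
    writer.writerows(rows)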

An alternative would be to write the timestamp, thread, category, level and the log message to a database table. Writing SQL queries against that table is a really easy way of generating custom reports with time ranges, LIKE filters, etc.
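
A minimal sketch of that approach with Python's built-in sqlite3 module; the table name, columns and sample row are made up for illustration:

import sqlite3

conn = sqlite3.connect("sql_times.db")
conn.execute("""CREATE TABLE IF NOT EXISTS sql_log
                (ts TEXT, thread TEXT, category TEXT, sql TEXT, msec INTEGER)""")

# rows would come from your log-parsing script
rows = [("2013-05-26 08:22:10,583", "SelectCampaignTask",
         "jdbc.sqltiming", "select ... from categories", 7)]
conn.executemany("INSERT INTO sql_log VALUES (?, ?, ?, ?, ?)", rows)
conn.commit()

# example report: calls, max and average time per statement within a time range
for row in conn.execute("""SELECT sql, COUNT(*), MAX(msec), AVG(msec)
                           FROM sql_log
                           WHERE ts BETWEEN '2013-05-26 00:00' AND '2013-05-27 00:00'
                           GROUP BY sql"""):
    print(row)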

Mining log files seems to be a rite of passage for most developers and is often a good time to learn a scripting language...
