Convert log file into json file using python

I am new to Python. I am trying to convert a log file into a JSON file using a Python script. I created a main file and a del6 file. The script should convert the log file and write the result into a new JSON file. On execution, it shows me the following error.

Traceback (most recent call last):
  File "main.py", line 23, in <module>
    main()
  File "main.py", line 14, in main
    print toJson(sys.argv[2])
  File "/home/paulsteven/BEAT/apache/del6.py", line 46, in toJson
    entries = readfile(file)
  File "/home/paulsteven/BEAT/apache/del6.py", line 21, in readfile
    filecontent[index] = line2dict(line)
  File "/home/paulsteven/BEAT/apache/del6.py", line 39, in line2dict
    res = m.groupdict()
AttributeError: 'NoneType' object has no attribute 'groupdict'

I tried the approach from a similar "log to json" question, but it doesn't give me a proper solution. Is there any way to solve this?

Here is my sample log file:

February 14 2019, 15:38:47      172.217.160.132     www.google.com      up      tcp-tcp@ www.google.com     172.217.160.132
February 14 2019, 15:38:47      104.28.4.86     www.smackcoders.com     up      tcp-tcp@ www.smackcoders.com        104.28.4.86     

The output should look like this:

{"1": {"timestamp": "February 14 2019, 15:38:47", "monitorip": "172.217.160.132 ", "monitorhost": "www.google.com", "monitorstatus": "up", "monitorid": "tcp-tcp@ www.google.com", "resolveip": "172.217.160.132"}, "2": {"timestamp": "February 14 2019, 15:38:47", "monitorip": "104.28.4.86", "monitorhost": "www.smackcoders.com", "monitorstatus": "up", "monitorid": "tcp-tcp@ www.smackcoders.com", "resolveip": "104.28.4.86"}

Here is the main Python code (main.py):

import sys
from del6 import *

def main():
    if len(sys.argv) < 3:
        print "Incorrect Syntax. Usage: python main.py -f <filename>"
        sys.exit(2)
    elif sys.argv[1] != "-f":
        print "Invalid switch '"+sys.argv[1]+"'"
        sys.exit(2)
    elif os.path.isfile(sys.argv[2]) == False:
        print "File does not exist"
        sys.exit(2)
    print toJson(sys.argv[2])
    text_file = open("tcp.json", "a+")
    text_file.write(toJson(sys.argv[2]))
    text_file.write("\n")
    text_file.close()



if __name__ == "__main__":
    main()

Here's my del6 code (del6.py):

import fileinput
import re
import os
try: import simplejson as json
except ImportError: import json

#read input file and return entries' Dict Object
def readfile(file):
    filecontent = {}
    index = 0
    #check necessary file size checking
    statinfo = os.stat(file)

    #just a guesstimate: a single entry contains at least 150 chars
    if statinfo.st_size < 150:
        print "Not a valid access_log file. It does not have enough data"
    else:
        for line in fileinput.input(file):
            index = index+1
            if line != "\n": #don't read newlines
                filecontent[index] = line2dict(line)

        fileinput.close()
    return filecontent

#gets a line of string from Log and convert it into Dict Object
def line2dict(line):
    #Snippet, thanks to http://www.seehuhn.de/blog/52
    parts = [
        r'(?P<timestamp>\S+)',
        r'(?P<monitorip>\S+)',
        r'(?P<monitorhost>\S+)',
        r'(?P<monitorstatus>\S+)',
        r'"(?P<monitorid>\S+)"',
        r'(?P<resolveip>\S+)',
    ]
    pattern = re.compile(r'\s+'.join(parts)+r'\s*\Z')
    m = pattern.match(line)
    res = m.groupdict()
    return res

#to get jSon of entire Log
#returns JSON object
def toJson(file):
    #get dict object for each entry
    entries = readfile(file)
    return json.JSONEncoder().encode(entries)
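For reference, the `AttributeError` means that `pattern.match(line)` returned `None`: the pattern above expects every field to be a single non-space token and `monitorid` to be wrapped in double quotes, but in the sample log the timestamp contains spaces and a comma, `monitorid` is unquoted and contains a space, and the fields are tab-separated. A minimal sketch of a pattern adjusted to the two sample lines follows (the exact field layout is an assumption drawn from those lines; it is also safer to check for `None` before calling `groupdict`):

```python
import re

# Assumed layout, inferred from the two sample rows: a "Month DD YYYY, HH:MM:SS"
# timestamp, then whitespace-separated fields, where monitorid itself contains
# one space (e.g. "tcp-tcp@ www.google.com").
pattern = re.compile(
    r'(?P<timestamp>\w+ \d+ \d+, \d+:\d+:\d+)\s+'
    r'(?P<monitorip>\S+)\s+'
    r'(?P<monitorhost>\S+)\s+'
    r'(?P<monitorstatus>\S+)\s+'
    r'(?P<monitorid>\S+ \S+)\s+'   # note the embedded space
    r'(?P<resolveip>\S+)\s*$'
)

line = ('February 14 2019, 15:38:47\t\t172.217.160.132\t\twww.google.com'
        '\t\tup\t\ttcp-tcp@ www.google.com\t\t172.217.160.132\n')
m = pattern.match(line)
if m is None:
    print('line did not match the expected layout')
else:
    print(m.groupdict())
```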

I see that the columns are divided by a double tab. So, based on that:

i = 1
result = {}
with open('log.txt') as f:
    lines = f.readlines()
    for line in lines:
        r = line.split('\t\t')
        result[i] = {'timestamp': r[0], 'monitorip': r[1], 'monitorhost': r[2], 'monitorstatus': r[3], 'monitorid': r[4], 'resolveip': r[5]}
        i += 1

Output:

{1: {'timestamp': 'February 14 2019, 15:38:47', 'monitorip': '172.217.160.132', 'monitorhost': 'www.google.com', 'monitorstatus': 'up', 'monitorid': 'tcp-tcp@ www.google.com', 'resolveip': '172.217.160.132\n'}, 2: {'timestamp': 'February 14 2019, 15:38:47', 'monitorip': '104.28.4.86', 'monitorhost': 'www.smackcoders.com', 'monitorstatus': 'up', 'monitorid': 'tcp-tcp@ www.smackcoders.com', 'resolveip': '104.28.4.86'}}

Or, if you want a list of dicts, which is more natural:

result = []
with open('log.txt') as f:
    lines = f.readlines()
    for line in lines:
        r = line.split('\t\t')
        result.append({'timestamp': r[0], 'monitorip': r[1], 'monitorhost': r[2], 'monitorstatus': r[3], 'monitorid': r[4], 'resolveip': r[5]})

Output:

[{'timestamp': 'February 14 2019, 15:38:47', 'monitorip': '172.217.160.132', 'monitorhost': 'www.google.com', 'monitorstatus': 'up', 'monitorid': 'tcp-tcp@ www.google.com', 'resolveip': '172.217.160.132\n'}, {'timestamp': 'February 14 2019, 15:38:47', 'monitorip': '104.28.4.86', 'monitorhost': 'www.smackcoders.com', 'monitorstatus': 'up', 'monitorid': 'tcp-tcp@ www.smackcoders.com', 'resolveip': '104.28.4.86'}]

Thanks for the answer. To save it in a JSON file:

import json


i = 1
result = {}
with open('tcp.log') as f:
    lines = f.readlines()
    for line in lines:
        r = line.split('\t\t')
        result[i] = {'timestamp': r[0], 'monitorip': r[1], 'monitorhost': r[2], 'monitorstatus': r[3], 'monitorid': r[4], 'resolveip': r[5]}
        i += 1 
print(result) 
with open('data.json', 'w') as fp:
    json.dump(result, fp)
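To sanity-check the written file, it can be read back with `json.load`. One detail worth knowing: JSON object keys are always strings, so the integer indices used above come back as `"1"`, `"2"`, and so on. A small round-trip sketch (the entry data here is abbreviated for illustration):

```python
import json

# A dict keyed by integers, as built in the loop above (values abbreviated).
result = {1: {'monitorhost': 'www.google.com', 'monitorstatus': 'up'}}

with open('data.json', 'w') as fp:
    json.dump(result, fp)

with open('data.json') as fp:
    loaded = json.load(fp)

# json.dump coerces integer keys to strings, so look entries up by "1", not 1.
print(loaded['1']['monitorhost'])  # prints www.google.com
```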

Below is a generic approach to the problem. The function `log_lines_to_json` will handle any text file where the fields are separated by `field_delimiter` and the field names are given in `field_names`:

FIELD_NAMES = ['timestamp', 'monitorip', 'monitorhost', 'monitorstatus', 'monitorid', 'resolveip']
FIELD_DELIMITER = '\t\t'


def log_lines_to_json(log_file, field_names, field_delimiter):
    result = []
    with open(log_file) as f:
        lines = f.readlines()
        for line in lines:
            fields = line.split(field_delimiter)
            result.append({field_name: fields[idx] for idx, field_name in enumerate(field_names)})
    return result


entries = log_lines_to_json('log.txt', FIELD_NAMES, FIELD_DELIMITER)
for entry in entries:
    print(entry)

Output:

{'monitorid': 'tcp-tcp@ www.google.com', 'monitorstatus': 'up', 'timestamp': 'February 14 2019, 15:38:47', 'monitorhost': 'www.google.com', 'monitorip': '172.217.160.132', 'resolveip': '172.217.160.132\n'}
{'monitorid': 'tcp-tcp@ www.smackcoders.com', 'monitorstatus': 'up', 'timestamp': 'February 14 2019, 15:38:47', 'monitorhost': 'www.smackcoders.com', 'monitorip': '104.28.4.86', 'resolveip': '104.28.4.86'}
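Note that the last field still carries the trailing newline (`'172.217.160.132\n'` in the first entry above), just as in the earlier answers. A small refinement, stripping each line before splitting, avoids that. This is a minimal sketch using the same field names and delimiter as above:

```python
FIELD_NAMES = ['timestamp', 'monitorip', 'monitorhost',
               'monitorstatus', 'monitorid', 'resolveip']

def parse_line(line, field_names, field_delimiter='\t\t'):
    # strip() removes the trailing '\n' (and any stray trailing spaces,
    # as in the second sample row) before splitting into fields
    fields = line.strip().split(field_delimiter)
    return dict(zip(field_names, fields))

line = ('February 14 2019, 15:38:47\t\t172.217.160.132\t\twww.google.com'
        '\t\tup\t\ttcp-tcp@ www.google.com\t\t172.217.160.132\n')
print(parse_line(line, FIELD_NAMES)['resolveip'])  # prints 172.217.160.132, no newline
```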
