
Logstash can't read and store log data from a file after a Python script writes new data into it

I use a Python script to automatically append log entries to a file, which is also the source log for my Logstash instance. The content is written correctly by the Python script, but after that Logstash can no longer read the log data, and it doesn't work even after I restart Logstash.

The main steps are:

  1. Start Elasticsearch (with the default configuration) and Logstash (with the configuration shown below). Logstash stores all the existing data from the source log file to Elasticsearch and prints the events to the console.
  2. Start the Python script (shown below) to write JSON data into the source log file. The data is written successfully, but Logstash no longer reads the new data. Even if I restart Logstash, it still can't read the data in the source log or store it to Elasticsearch.

Has anyone run into this problem before? Here is my Python code:

import json


def store(filepath, data):
    # Append one JSON document per line to the destination log.
    with open(filepath, 'a') as json_file:
        json_file.write(json.dumps(data))
        json_file.write("\r")


def load(filepath):
    # Read the whole source file as a single JSON document.
    with open(filepath) as json_file:
        data = json.load(json_file)
        return data


if __name__ == "__main__":
    sourceFilePath = "elk_data_source.log"
    destFilePath = "elk_data_dest2.log"
    for i in range(1, 20):
        data = load(sourceFilePath)
        data["eventType"] = "*ABC"
        store(destFilePath, data)

    read = open(destFilePath)
    line = read.readline()
    while line:
        print(line)
        # context = json.loads(line)
        # context = context.join([string.strip().rsplit("}", 1)[0], "}"])
        # print(context)
        line = read.readline()
    read.close()

Here is my Logstash conf file, which works correctly if I manually enter data into this log:

input {
  file {
    type => "accounts"
    path => "/ELK_Data/elk_data_dest2.log"
    start_position => "beginning"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash-2016.12.20"
  }
}

Here is my elk_data_source.log:

  {"eventType": ["*icbc"], "prodName": ["LDAP"], "prodFmid": ["HRSL420"], "systemSmfid": ["EIMG"]}

Versions: logstash-5.0.0, elasticsearch-2.4.1, Python 2.7.6

I don't fully understand your question, but, just as with json_file.close() in the first function, shouldn't the last line be read.close() rather than read.close? (That call would actually be redundant if you used with.)

From what I gather, the source file is elk_data_source.log and you are trying to overwrite "eventType" = ["*icbc"] with *ABC. But what you missed is that the value of "eventType" is an array, and you are writing a single string value instead: *ABC.

Change data["eventType"] = "*ABC" to data["eventType"] = ["*ABC"].
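A minimal sketch of the corrected assignment (the field names and sample document are taken from the question; everything else is just for illustration):

```python
import json

# Sample document, as in elk_data_source.log from the question.
data = {"eventType": ["*icbc"], "prodName": ["LDAP"],
        "prodFmid": ["HRSL420"], "systemSmfid": ["EIMG"]}

# Assign a list, matching the original shape -- not the bare string "*ABC".
data["eventType"] = ["*ABC"]

# This is what store() would then append to elk_data_dest2.log.
line = json.dumps(data)
```

With the list value, the documents written to the destination file keep the same field types as the source, so the mapping Elasticsearch inferred from the earlier documents still fits.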

This should solve it. If possible, compare both files with a file-comparison tool. Also check for mismatched braces or extra whitespace that might cause issues while reading and parsing the files.
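Along those lines, a quick way to spot such problems is to check that every line of the destination file parses as standalone JSON and that list-valued fields stayed lists. A sketch (the helper name and the set of fields to check are my own, not from the question):

```python
import json


def validate_ndjson(path, list_fields=("eventType",)):
    """Return (line_number, message) pairs for lines that are not valid
    JSON, or whose given fields are not lists."""
    problems = []
    with open(path) as fh:
        for lineno, line in enumerate(fh, 1):
            line = line.strip()
            if not line:
                continue  # skip blank lines
            try:
                doc = json.loads(line)
            except ValueError as exc:  # json.JSONDecodeError subclasses ValueError
                problems.append((lineno, "invalid JSON: %s" % exc))
                continue
            for field in list_fields:
                if field in doc and not isinstance(doc[field], list):
                    problems.append((lineno, "%r is not a list" % field))
    return problems
```

Running it over elk_data_dest2.log would flag both malformed lines and the string-valued "eventType" described above.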
