
Ping multiple IPs and write to a JSON file (Python)

I am pinging multiple IPs on the LAN to check whether each one is alive. The code runs every minute on a schedule. To ping multiple IPs I used multiprocessing, and that part works well. After pinging, I want to write the ping results to a JSON file. But when writing to the JSON file, only the output for the last IP is written; I want all of them. Is there any way to do this?

Here's the sample code:

import json
from multiprocessing import Pool
import subprocess
from datetime import datetime
timestamp = datetime.now().strftime("%B %d %Y, %H:%M:%S")
hosts =  ["192.168.1.47","192.168.1.42"]
count = 1
wait_sec = 1
n = len(hosts)
def main(hosts):
    p = Pool(processes= n)
    result = p.map(beat, hosts)
def beat(hosts):
    #Name for the log file
    name = 'icmp.json'
    ip4write(hosts, name)
def ip4write(hosts, name):
    global ip4a
    ip4a = hosts
    ipve4(hosts, name)
    write(hosts, name)
def ipve4(hosts, name):
    global u
    status, result = subprocess.getstatusoutput("ping -c1 -w2 " + str(ip4a))
    if status == 0:
        print(str(ip4a) + " UP")
        u = " UP"
def write(hosts, name):
    text_file = open(name, "a+")
    with open(name) as json_file:
        try:
            data = json.load(json_file)
        except:
            data = {}
        with open(name, 'w') as outfile:
            data[timestamp] = {
                'monitor.ip': str(hosts),
                'monitor.status': u
            }
            print(data)
            json.dump(data, outfile)
            print('Data written')
    text_file.close()
main(hosts)

Output in the JSON file:

{"February 15 2019, 16:38:12": {"monitor.status": " UP", "monitor.ip": "192.168.1.42"}}

My required output:

{"February 15 2019, 16:38:12": {"monitor.ip": "192.168.1.47", "monitor.status": " UP"}, "February 15 2019, 16:38:12": {"monitor.ip": "192.168.1.42", "monitor.status": " UP"}}

To keep adding content to an existing file without overwriting what is already there, you should open the file in append mode. In your code, you are opening it in write mode, which opens the file for writing but overwrites the existing contents.

Specifically, this line in your code:

with open(name, 'w') as outfile:

You should change the open mode from write ('w') to append ('a').

with open(name, 'a') as outfile:
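
As a quick illustration of the difference between the two modes (a minimal sketch; demo.json is just a throwaway file name chosen for this example):

import json

# demo.json is only a scratch file for this illustration.
# 'w' truncates the file each time, so only the last dump survives.
with open('demo.json', 'w') as f:
    json.dump({"first": 1}, f)
with open('demo.json', 'w') as f:
    json.dump({"second": 2}, f)
# demo.json now contains only {"second": 2}

# 'a' keeps the existing bytes and writes after them.
with open('demo.json', 'a') as f:
    json.dump({"third": 3}, f)
# demo.json now contains {"second": 2}{"third": 3} -- note that this is two
# concatenated JSON objects, not a single JSON document.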

Let me know if this solves your problem.

Below is a compact version of the code:

import os
from multiprocessing import Pool
import json
import datetime
import time

hosts = ["192.168.1.47", "8.8.8.8"]
MAX_NUMBER_OF_STATUS_CHECKS = 2
FILE_NAME = 'hosts_stats.json'


#
# counter and sleep were added in order to simulate scheduler activity
#

def ping(host):
    # Note: the -o option (exit after the first reply) is BSD/macOS-specific;
    # on Linux drop it, or use a deadline such as -w instead.
    status = os.system('ping -o -c 3 {}'.format(host))
    return (datetime.datetime.now().strftime("%B %d %Y, %H:%M:%S"),
            {"monitor.ip": host,
             "monitor.status": 'UP' if status == 0 else 'DOWN'})


if __name__ == "__main__":
    p = Pool(processes=len(hosts))
    counter = 0
    if not os.path.exists(FILE_NAME):
        with open(FILE_NAME, 'w') as f:
            f.write('{}')
    while counter < MAX_NUMBER_OF_STATUS_CHECKS:
        result = p.map(ping, hosts)
        with open(FILE_NAME, 'rb+') as f:
            f.seek(0, os.SEEK_END)
            already_has_entries = f.tell() > 2   # more than the bare '{}'
            f.seek(-1, os.SEEK_END)              # step back over the closing '}'
            f.truncate()
            entries = ','.join('"{}":{}'.format(ts, json.dumps(doc)) for ts, doc in result)
            if already_has_entries:
                entries = ',' + entries          # separate from the previous entries
            f.write(entries.encode())            # file is open in binary mode
            f.write(b'}')                        # restore the closing brace
        counter += 1
        time.sleep(2)
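
Since the file stays a single JSON object keyed by timestamp, reading the accumulated results back is straightforward (a small sketch, assuming the hosts_stats.json produced above):

import json

with open('hosts_stats.json') as f:
    stats = json.load(f)

for ts, doc in stats.items():
    print(ts, doc['monitor.ip'], doc['monitor.status'])

Note that JSON object keys must be unique, so if two checks land on exactly the same timestamp, json.load will keep only the last one.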
