Python Watchdog Subprocess Queue

I have copied a Python watchdog script from the following website: https://www.michaelcho.me/article/using-pythons-watchdog-to-monitor-changes-to-a-directory

import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler


class Watcher:
    DIRECTORY_TO_WATCH = "/path/to/my/directory"

    def __init__(self):
        self.observer = Observer()

    def run(self):
        event_handler = Handler()
        self.observer.schedule(event_handler, self.DIRECTORY_TO_WATCH, recursive=True)
        self.observer.start()
        try:
            while True:
                time.sleep(5)
        except KeyboardInterrupt:
            # Ctrl-C stops the observer cleanly.
            self.observer.stop()
            print("Observer stopped")

        self.observer.join()


class Handler(FileSystemEventHandler):

    @staticmethod
    def on_any_event(event):
        if event.is_directory:
            return None

        elif event.event_type == 'created':
            # Take any action here when a file is first created.
            print "Received created event - %s." % event.src_path
            # Build up queue of subtasks here and let another thread/process 
            # take care of it so that main process can continue.

        elif event.event_type == 'modified':
            # Take any action here when a file is modified.
            print("Received modified event - %s." % event.src_path)


if __name__ == '__main__':
    w = Watcher()
    w.run()

This script works great for me, but I have some additional requirements.

  1. Instead of printing text, I would like to start an additional process (Python script) that can take a number of minutes. The main script should not wait for this process to finish but instead keep checking for new or changed files.

  2. The secondary processes that are kicked off are not allowed to overtake each other, so have to be placed in some kind of queue that needs to be processed in series.

What method/package is a good way to tackle these requirements? I have briefly looked at multiprocessing and asyncio, but am unsure about a correct implementation.

My general idea is that each event adds a task to a queue, and a separate thread/process works through that queue one item at a time. Ideally this secondary thread/process would finish the remaining queue when the main process is shut down.
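A minimal sketch of that idea, assuming the standard library is enough: a queue.Queue holds pending paths, the handler only enqueues (so the observer keeps responding to new events), and a single worker thread launches one subprocess at a time with subprocess.run. The script name process_file.py is a placeholder for the long-running task, not part of the original setup.

import queue
import subprocess
import sys
import threading
import time

from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

task_queue = queue.Queue()


def worker():
    # Work through the queue one item at a time, in arrival order.
    while True:
        path = task_queue.get()
        try:
            # subprocess.run blocks until the child script finishes,
            # so secondary processes can never overtake each other.
            # "process_file.py" is a placeholder for the long-running script.
            subprocess.run([sys.executable, "process_file.py", path])
        finally:
            task_queue.task_done()


class QueueingHandler(FileSystemEventHandler):

    def on_created(self, event):
        if not event.is_directory:
            task_queue.put(event.src_path)  # returns immediately


if __name__ == '__main__':
    threading.Thread(target=worker, daemon=True).start()
    observer = Observer()
    observer.schedule(QueueingHandler(), "/path/to/my/directory", recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(5)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()
    task_queue.join()  # let the worker drain the remaining queue before exit

Because there is exactly one worker thread and subprocess.run blocks until the child exits, the queued tasks run strictly in series; task_queue.join() at the end lets the worker finish whatever is still queued after the watcher is stopped.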

I used this template for watchdog. on_any_event is a little too sensitive in my case.
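If on_any_event fires more often than you want, one option is watchdog's PatternMatchingEventHandler, which filters events before your callbacks run. A sketch, assuming you only care about particular files (the *.csv pattern is just an example):

from watchdog.events import PatternMatchingEventHandler


class CsvHandler(PatternMatchingEventHandler):

    def __init__(self):
        # Only react to *.csv files and skip directory events entirely;
        # the "*.csv" pattern is an example, not from the original question.
        super().__init__(patterns=["*.csv"], ignore_directories=True)

    def on_created(self, event):
        print("Received created event - %s." % event.src_path)

Overriding only on_created also means modified, moved, and deleted events are ignored entirely.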

Answer for 1: You can put whatever you want in place of print: a function, a method, a loop, etc. Watcher will continue to run and call Handler() when an event happens.

I think you need to elaborate more on what you want to do after on_any_event is called.

from watchdog.events import FileSystemEventHandler


class Handler(FileSystemEventHandler):

    @staticmethod
    def on_any_event(event):
        if event.is_directory:
            return None

        elif event.event_type == 'created':
            return run_function()

        elif event.event_type == 'modified':
            return run_other_function()


def run_function():
    W = '36 Chambers\n'
    with open('Wu-Tang.txt', 'w') as wu:  # 'w' truncates and rewrites
        wu.write(W)


def run_other_function():
    _W = 'The RZA the GZA.\n'
    with open('Wu-Tang.txt', 'a') as _wu:  # 'a' appends
        _wu.write(_W)  # the with block closes the file; no close() needed


if __name__ == '__main__':
    w = Watcher()
    w.run()
