
Timeout decorator on a multiprocessing function

I have this decorator taken directly from an example I found on the net:

import signal


class TimedOutExc(Exception):
    pass


def timeout(timeout):
    def decorate(f):
        def handler(signum, frame):
            raise TimedOutExc()

        def new_f(*args, **kwargs):

            # Install the timeout handler and schedule the alarm
            old = signal.signal(signal.SIGALRM, handler)
            signal.alarm(timeout)

            try:
                result = f(*args, **kwargs)
            except TimedOutExc:
                return None
            finally:
                signal.alarm(0)  # Cancel any pending alarm
                signal.signal(signal.SIGALRM, old)  # Restore the previous handler
            return result

        new_f.__name__ = f.__name__
        return new_f

    return decorate

It raises an exception if the function f times out.
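For example (a minimal use of the decorator above, assuming a Unix-like system, since SIGALRM is not available on Windows):

import time

@timeout(2)
def slow():
    # Hypothetical slow function used only to exercise the decorator
    time.sleep(10)
    return "done"

print(slow())  # Prints None after about 2 seconds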

Well, it works, but when I use this decorator on a multiprocessing function and it stops due to a timeout, it doesn't terminate the processes involved in the computation. How can I do that?

I don't want to raise an exception and stop the program. Basically, what I want is for f to return None when it times out, and then to terminate the processes involved.

While I agree with the main point of Aaron's answer, I would like to elaborate a bit.

The processes launched by multiprocessing must be stopped in the function to be decorated; I don't think that this can be done generally and simply from the decorator itself (the decorated function is the only entity that knows what calculations it launched).

Instead of having the decorated function catch SIGALRM, you can also catch your custom TimedOutExc exception; this might be more flexible. Your example would then become:

import signal
import functools

class TimedOutExc(Exception):
    """
    Raised when a timeout happens
    """

def timeout(timeout):
    """
    Return a decorator that raises a TimedOutExc exception
    after timeout seconds, if the decorated function did not return.
    """

    def decorate(f):

        def handler(signum, frame):
            raise TimedOutExc()

        @functools.wraps(f)  # Preserves the documentation, name, etc.
        def new_f(*args, **kwargs):

            old_handler = signal.signal(signal.SIGALRM, handler)
            signal.alarm(timeout)

            result = f(*args, **kwargs)  # f() always returns, in this scheme

            signal.signal(signal.SIGALRM, old_handler)  # Old signal handler is restored
            signal.alarm(0)  # Alarm removed

            return result

        return new_f

    return decorate

@timeout(10)
def function_that_takes_a_long_time():
    try:
        ...  # Long, parallel calculation
    except TimedOutExc:
        ...  # Code that shuts down the processes
        return None  # Or re-raise, which means that the calculation is not complete
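If the parallel calculation uses a multiprocessing.Pool, the body could look like the following sketch (some_long_task, the 100 work items and the pool layout are only placeholders, not something from your question; timeout and TimedOutExc are the ones defined above):

import multiprocessing
import time

def some_long_task(x):
    # Hypothetical unit of work, standing in for part of the real calculation
    time.sleep(1)
    return x * x

@timeout(10)
def function_that_takes_a_long_time():
    pool = multiprocessing.Pool()
    try:
        # map() blocks until all results are in, so this is where the
        # TimedOutExc raised by the SIGALRM handler surfaces
        results = pool.map(some_long_task, range(100))
    except TimedOutExc:
        pool.terminate()  # Stop the worker processes immediately
        pool.join()
        return None       # The calculation is not complete
    else:
        pool.close()      # Normal shutdown: no more work will be submitted
        pool.join()
        return results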

I doubt that can be done with a decorator: A decorator is a wrapper for a function; the function is a black box. There is no communication between the decorator and the function it wraps.

What you need to do is rewrite your function's code to use the SIGALRM handler to terminate any processes it has started.
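A rough sketch of that idea, without the decorator (the worker, the pool and the 10-second limit are only placeholders for illustration; TimedOutExc is the exception class defined earlier):

import multiprocessing
import signal
import time

def worker(x):
    # Hypothetical unit of work
    time.sleep(1)
    return x * x

def function_that_takes_a_long_time():
    pool = multiprocessing.Pool()

    def handler(signum, frame):
        # The handler closes over the pool, so it can shut the workers down itself
        pool.terminate()
        raise TimedOutExc()

    old_handler = signal.signal(signal.SIGALRM, handler)
    signal.alarm(10)  # Time limit in seconds
    try:
        return pool.map(worker, range(100))
    except TimedOutExc:
        pool.join()
        return None
    finally:
        signal.alarm(0)                             # Cancel any pending alarm
        signal.signal(signal.SIGALRM, old_handler)  # Restore the previous handler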
