
Python return tuple in threading

I'm stuck on a problem with threading in Python. Here is what I have. I have a decorator that I use on every function to get two values, the elapsed time and the result. After I get those values, I use Python's logging module to store the time. Now I want to run two functions at the same time and get those values. What is the simplest way to do that?

Decorator:

import datetime

def my_dec(func):
    def wrapper(*args):
        start_time = datetime.datetime.now().replace(microsecond=0)
        result = func(*args)
        end_time = datetime.datetime.now().replace(microsecond=0)
        time = end_time - start_time
        return time, result
    return wrapper

Function1:

@my_dec
def get_taskbar_tooltip_message(timeout=10):
    # This function calls other functions that wait and return a value within the given timeout
    # Returns a string
    ...

Function2:

@my_dec
def check_pis(timer=200, status="start"):
    # This function also calls other functions, which take some time
    # Returns a bool
    ...

Now I want to run two functions at the same time and get those values.

You could start by trying out the high-level tools from the standard library's concurrent.futures module.

Multiprocessing gives you real parallel execution:

import time
from functools import wraps
from concurrent.futures import ProcessPoolExecutor as Executor

def my_dec(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        end = time.perf_counter()
        return end - start, result
    return wrapper

@my_dec
def get_taskbar_tooltip_message(timeout=10):
    time.sleep(3)
    return "a string"

@my_dec
def check_pis(timer=200, status="start"):
    time.sleep(5)
    return True

if __name__ == "__main__":

    with Executor() as exec:
        start = time.perf_counter()
        result_1 = exec.submit(get_taskbar_tooltip_message)
        result_2 = exec.submit(check_pis)
        print(result_1.result(), result_2.result())
        end = time.perf_counter()
        print(end - start)
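
Each Future's result() returns the (time, result) tuple produced by the decorator, so you can unpack it and hand the timing to the logging module, as mentioned in the question. A rough sketch, assuming the decorated functions from the snippet above are already defined:

import logging
from concurrent.futures import ProcessPoolExecutor as Executor

logging.basicConfig(level=logging.INFO)

if __name__ == "__main__":
    with Executor() as exec:
        future_1 = exec.submit(get_taskbar_tooltip_message)
        future_2 = exec.submit(check_pis)
        elapsed_1, message = future_1.result()  # e.g. (3.0..., "a string")
        elapsed_2, ok = future_2.result()       # e.g. (5.0..., True)
        logging.info("get_taskbar_tooltip_message took %.2fs -> %r", elapsed_1, message)
        logging.info("check_pis took %.2fs -> %r", elapsed_2, ok)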

A couple of remarks:

  • I replaced datetime.datetime.now() with time.perf_counter() which seems better suited to the task at hand.
  • As pointed out by @Booboo: Using wraps is good practice. And it's rather important here: without it, the multiprocessing version would fail with a pickling error, because the wrapper's metadata no longer points at the original function name (see the short sketch after this list).
  • I have extended the wrapper signature with **kwargs so keyword arguments are passed through as well (also pointed out by @Booboo).
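
A minimal illustration of the wraps point, using two throwaway decorators: pickle locates a function by its __qualname__, and without wraps that name refers to the inner wrapper, which cannot be looked up in the module.

import functools

def without_wraps(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def with_wraps(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@without_wraps
def f():
    pass

@with_wraps
def g():
    pass

print(f.__qualname__)  # 'without_wraps.<locals>.wrapper' -- pickle can't resolve this name
print(g.__qualname__)  # 'g' -- resolves to the wrapper, so submitting it to a process pool works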

Threading looks similar; just use a different type of Executor:

import time
from functools import wraps
from concurrent.futures import ThreadPoolExecutor as Executor

...

with Executor() as exec:
    start = time.perf_counter()
    result_1 = exec.submit(get_taskbar_tooltip_message)
    result_2 = exec.submit(check_pis)
    print(result_1.result(), result_2.result())
    end = time.perf_counter()
    print(end - start)

Here the if __name__ == "__main__": guard isn't necessary. Threading stays within the current process, so (because of the GIL) it isn't real parallelism, but it is still extremely useful.

Which concurrency approach to choose depends on your use case: multiprocessing is better for CPU-heavy tasks, whereas threading tends to be better for I/O-heavy tasks. (Python also offers lower-level interfaces for both multiprocessing and threading.) If you are looking at I/O-heavy tasks, there is another approach, asyncio, which might be an alternative: there is a bit of a learning curve and you might have to restructure your functions, but it can be worth the effort.
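
For the I/O-heavy case, a rough asyncio sketch might look like this. It assumes the two functions can be rewritten as coroutines (the sleeps again stand in for the real waiting) and that the decorator becomes async-aware:

import asyncio
import time

def my_dec(func):
    # async-aware variant of the decorator: awaits the wrapped coroutine
    async def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = await func(*args, **kwargs)
        return time.perf_counter() - start, result
    return wrapper

@my_dec
async def get_taskbar_tooltip_message(timeout=10):
    await asyncio.sleep(3)
    return "a string"

@my_dec
async def check_pis(timer=200, status="start"):
    await asyncio.sleep(5)
    return True

async def main():
    # run both coroutines concurrently and collect their (time, result) tuples
    results = await asyncio.gather(get_taskbar_tooltip_message(), check_pis())
    print(results)  # roughly [(3.0, 'a string'), (5.0, True)]

asyncio.run(main())

The decorator logic stays the same; only the waiting changes from blocking calls to awaits.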
