
Limit the number of threads used by a child process launched with `multiprocessing.Process`

I'm trying to launch a function (my_function) and stop its execution after a certain time limit is reached. So I turned to the multiprocessing library, and everything works well. Here is the code, where my_function() has been replaced by a dummy that only builds a message.

from multiprocessing import Queue, Process
from multiprocessing.queues import Empty
import time

timeout=1
# timeout=3


def my_function(something):
    time.sleep(2)
    return f'my message: {something}'

def wrapper(something, queue):
    message ="too late..."
    try:
        message = my_function(something)
        return message
    finally:
        queue.put(message)

try:
    queue = Queue()
    params = ("hello", queue)
    child_process = Process(target=wrapper, args=params)
    child_process.start()
    output = queue.get(timeout=timeout)
    print(f"ok: {output}")
except Empty:
    timeout_message = f"Timeout {timeout}s reached"
    print(timeout_message)
finally:
    if 'child_process' in locals():
        child_process.kill()

You can test and verify that, depending on whether timeout=1 or timeout=3 is used, the timeout error is triggered or not.

My main problem is that the real my_function() is a torch model inference for which I would like to limit the number of threads (to 4, let's say).

This is easy to do when my_function runs in the main process, but in my example I tried a lot of tricks to limit it in the child process, without any success: threadpoolctl.threadpool_limits(4), torch.set_num_threads(4), os.environ["OMP_NUM_THREADS"] = "4", os.environ["MKL_NUM_THREADS"] = "4".
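
For context, here is a sketch of where I placed those calls inside the child process; the real torch inference is omitted and my_function stands in for it:

import os

def wrapper(something, queue):
    # attempted limits, set in the child before the inference runs
    # environment variables must be strings and are only picked up if they
    # are set before the threaded libraries create their thread pools
    os.environ["OMP_NUM_THREADS"] = "4"
    os.environ["MKL_NUM_THREADS"] = "4"

    import torch
    torch.set_num_threads(4)

    message = "too late..."
    try:
        message = my_function(something)  # the real torch inference in practice
    finally:
        queue.put(message)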

I'm completely open to other solutions that can monitor the execution time of a function while limiting the number of threads used by that function.

Thanks, regards.

You can limit the number of simultaneous processes with Pool (https://docs.python.org/3/library/multiprocessing.html#module-multiprocessing.pool). You can also set the maximum number of tasks done per child. Check it out.

Here is a sample from SuperFastPython by Jason Brownlee:

  # SuperFastPython.com
  # example of limiting the number of tasks per child in the process pool
  from time import sleep
  from multiprocessing.pool import Pool
  from multiprocessing import current_process
  
  # task executed in a worker process
  def task(value):
      # get the current process
      process = current_process()
      # report a message
      print(f'Worker is {process.name} with {value}', flush=True)
      # block for a moment
      sleep(1)
  
  # protect the entry point
  if __name__ == '__main__':
      # create and configure the process pool
      with Pool(2, maxtasksperchild=3) as pool:
          # issue tasks to the process pool
          for i in range(10):
              pool.apply_async(task, args=(i,))
          # close the process pool
          pool.close()
          # wait for all tasks to complete
          pool.join()
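
If the question's constraints (a 4-thread limit plus a timeout) also have to be covered, one possible variation is a pool with a single worker, an initializer that sets the thread limits once per worker process, and apply_async(...).get(timeout=...) for the time limit. This is only a sketch and assumes torch is installed; the limit of 4 threads and the 1 s timeout are simply the values from the question.

  # sketch: one pooled worker, thread limits applied in an initializer,
  # and a timeout enforced through AsyncResult.get()
  import os
  from time import sleep
  from multiprocessing import TimeoutError
  from multiprocessing.pool import Pool

  def limit_threads():
      # runs once in each worker process, before any task executes
      os.environ["OMP_NUM_THREADS"] = "4"
      os.environ["MKL_NUM_THREADS"] = "4"
      import torch  # assumption: torch is installed in this environment
      torch.set_num_threads(4)

  def my_function(something):
      # stand-in for the real model inference
      sleep(2)
      return f'my message: {something}'

  # protect the entry point
  if __name__ == '__main__':
      with Pool(processes=1, initializer=limit_threads) as pool:
          result = pool.apply_async(my_function, args=('hello',))
          try:
              print(f"ok: {result.get(timeout=1)}")
          except TimeoutError:
              print("Timeout 1s reached")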
