
How to use multiprocessing pool.map with multiple arguments

In the Python multiprocessing library, is there a variant of pool.map which supports multiple arguments?

import multiprocessing

text = "test"

def harvester(text, case):
    X = case[0]
    text + str(X)

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=6)
    case = RAW_DATASET
    pool.map(harvester(text, case), case, 1)
    pool.close()
    pool.join()

Is there a variant of pool.map which supports multiple arguments?

Python 3.3 includes the pool.starmap() method:

#!/usr/bin/env python3
from functools import partial
from itertools import repeat
from multiprocessing import Pool, freeze_support

def func(a, b):
    return a + b

def main():
    a_args = [1,2,3]
    second_arg = 1
    with Pool() as pool:
        L = pool.starmap(func, [(1, 1), (2, 1), (3, 1)])
        M = pool.starmap(func, zip(a_args, repeat(second_arg)))
        N = pool.map(partial(func, b=second_arg), a_args)
        assert L == M == N

if __name__=="__main__":
    freeze_support()
    main()

For older versions:

#!/usr/bin/env python2
import itertools
from multiprocessing import Pool, freeze_support

def func(a, b):
    print a, b

def func_star(a_b):
    """Convert `f([1,2])` to `f(1,2)` call."""
    return func(*a_b)

def main():
    pool = Pool()
    a_args = [1,2,3]
    second_arg = 1
    pool.map(func_star, itertools.izip(a_args, itertools.repeat(second_arg)))

if __name__=="__main__":
    freeze_support()
    main()

Output

1 1
2 1
3 1

Notice how itertools.izip() and itertools.repeat() are used here.

Due to the bug mentioned by @unutbu, you can't use functools.partial() or similar capabilities on Python 2.6, so the simple wrapper function func_star() should be defined explicitly. See also the workaround suggested by uptimebox.

The answer to this is version- and situation-dependent. The most general answer for recent versions of Python (since 3.3) was first described below by JF Sebastian.¹ It uses the Pool.starmap method, which accepts a sequence of argument tuples. It then automatically unpacks the arguments from each tuple and passes them to the given function:

import multiprocessing
from itertools import product

def merge_names(a, b):
    return '{} & {}'.format(a, b)

if __name__ == '__main__':
    names = ['Brown', 'Wilson', 'Bartlett', 'Rivera', 'Molloy', 'Opie']
    with multiprocessing.Pool(processes=3) as pool:
        results = pool.starmap(merge_names, product(names, repeat=2))
    print(results)

# Output: ['Brown & Brown', 'Brown & Wilson', 'Brown & Bartlett', ...

For earlier versions of Python, you'll need to write a helper function to unpack the arguments explicitly. If you want to use with, you'll also need to write a wrapper to turn Pool into a context manager. (Thanks to muon for pointing this out.)

import multiprocessing
from itertools import product
from contextlib import contextmanager

def merge_names(a, b):
    return '{} & {}'.format(a, b)

def merge_names_unpack(args):
    return merge_names(*args)

@contextmanager
def poolcontext(*args, **kwargs):
    pool = multiprocessing.Pool(*args, **kwargs)
    yield pool
    pool.terminate()

if __name__ == '__main__':
    names = ['Brown', 'Wilson', 'Bartlett', 'Rivera', 'Molloy', 'Opie']
    with poolcontext(processes=3) as pool:
        results = pool.map(merge_names_unpack, product(names, repeat=2))
    print(results)

# Output: ['Brown & Brown', 'Brown & Wilson', 'Brown & Bartlett', ...

In simpler cases, with a fixed second argument, you can also use partial, but only in Python 2.7+.

import multiprocessing
from functools import partial
from contextlib import contextmanager

@contextmanager
def poolcontext(*args, **kwargs):
    pool = multiprocessing.Pool(*args, **kwargs)
    yield pool
    pool.terminate()

def merge_names(a, b):
    return '{} & {}'.format(a, b)

if __name__ == '__main__':
    names = ['Brown', 'Wilson', 'Bartlett', 'Rivera', 'Molloy', 'Opie']
    with poolcontext(processes=3) as pool:
        results = pool.map(partial(merge_names, b='Sons'), names)
    print(results)

# Output: ['Brown & Sons', 'Wilson & Sons', 'Bartlett & Sons', ...

1. Much of this was inspired by his answer, which should probably have been accepted instead. But since this one is stuck at the top, it seemed best to improve it for future readers.

I think the below will be better:

def multi_run_wrapper(args):
    return add(*args)

def add(x, y):
    return x + y

if __name__ == "__main__":
    from multiprocessing import Pool
    pool = Pool(4)
    results = pool.map(multi_run_wrapper, [(1, 2), (2, 3), (3, 4)])
    print results

Output

[3, 5, 7]

Using Python 3.3+ with pool.starmap():

from multiprocessing.dummy import Pool as ThreadPool 

def write(i, x):
    print(i, "---", x)

a = ["1","2","3"]
b = ["4","5","6"] 

pool = ThreadPool(2)
pool.starmap(write, zip(a,b)) 
pool.close() 
pool.join()

Result:

1 --- 4
2 --- 5
3 --- 6

You can also zip() more arguments if you like: zip(a,b,c,d,e)
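For instance, a quick sketch of that (the third list c and the three-argument write3 are made up for illustration):

def write3(i, x, y):
    print(i, "---", x, "---", y)

c = ["7", "8", "9"]

pool = ThreadPool(2)
pool.starmap(write3, zip(a, b, c))  # write3("1", "4", "7"), write3("2", "5", "8"), ...
pool.close()
pool.join()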

In case you want to have a constant value passed as an argument:

import itertools

zip(itertools.repeat(constant), a)
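Putting that together with the write function above (a small sketch; the constant value "0" is arbitrary):

import itertools

constant = "0"
pool = ThreadPool(2)
# Each call receives the constant first: write("0", "1"), write("0", "2"), ...
pool.starmap(write, zip(itertools.repeat(constant), a))
pool.close()
pool.join()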

In case your function should return something:

results = pool.starmap(write, zip(a,b))

This gives a list with the returned values.

How to take multiple arguments:

def f1(args):
    a, b, c = args
    return a + b + c

if __name__ == "__main__":
    import multiprocessing
    pool = multiprocessing.Pool(4)

    result1 = pool.map(f1, [[1, 2, 3]])
    print(result1)

Having learnt about itertools in JF Sebastian's answer, I decided to take it a step further and write a parmap package that takes care of parallelization, offering map and starmap functions on Python 2.7 and Python 3.2 (and later) that can take any number of positional arguments.

Installation

pip install parmap

How to parallelize:

import parmap
# If you want to do:
y = [myfunction(x, argument1, argument2) for x in mylist]
# In parallel:
y = parmap.map(myfunction, mylist, argument1, argument2)

# If you want to do:
z = [myfunction(x, y, argument1, argument2) for (x,y) in mylist]
# In parallel:
z = parmap.starmap(myfunction, mylist, argument1, argument2)

# If you want to do:
listx = [1, 2, 3, 4, 5, 6]
listy = [2, 3, 4, 5, 6, 7]
param1 = 3.14
param2 = 42
listz = []
for (x, y) in zip(listx, listy):
    listz.append(myfunction(x, y, param1, param2))
# In parallel:
listz = parmap.starmap(myfunction, zip(listx, listy), param1, param2)

I have uploaded parmap to PyPI and to a GitHub repository.

As an example, the question can be answered as follows:

import parmap

def harvester(case, text):
    X = case[0]
    text + str(X)

if __name__ == "__main__":
    case = RAW_DATASET  # assuming this is an iterable
    parmap.map(harvester, case, "test", chunksize=1)

There's a fork of multiprocessing called pathos (note: use the version on GitHub) that doesn't need starmap -- the map functions mirror the API for Python's map, so map can take multiple arguments. With pathos, you can also generally do multiprocessing in the interpreter, instead of being stuck in the __main__ block. Pathos is due for a release, after some mild updating -- mostly conversion to Python 3.x.

  Python 2.7.5 (default, Sep 30 2013, 20:15:49) 
  [GCC 4.2.1 (Apple Inc. build 5566)] on darwin
  Type "help", "copyright", "credits" or "license" for more information.
  >>> def func(a,b):
  ...     print a,b
  ...
  >>>
  >>> from pathos.multiprocessing import ProcessingPool    
  >>> pool = ProcessingPool(nodes=4)
  >>> pool.map(func, [1,2,3], [1,1,1])
  1 1
  2 1
  3 1
  [None, None, None]
  >>>
  >>> # also can pickle stuff like lambdas 
  >>> result = pool.map(lambda x: x**2, range(10))
  >>> result
  [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
  >>>
  >>> # also does asynchronous map
  >>> result = pool.amap(pow, [1,2,3], [4,5,6])
  >>> result.get()
  [1, 32, 729]
  >>>
  >>> # or can return a map iterator
  >>> result = pool.imap(pow, [1,2,3], [4,5,6])
  >>> result
  <processing.pool.IMapIterator object at 0x110c2ffd0>
  >>> list(result)
  [1, 32, 729]

pathos has several ways that you can get the exact behavior of starmap.

>>> def add(*x):
...   return sum(x)
... 
>>> x = [[1,2,3],[4,5,6]]
>>> import pathos
>>> import numpy as np
>>> # use ProcessPool's map and transposing the inputs
>>> pp = pathos.pools.ProcessPool()
>>> pp.map(add, *np.array(x).T)
[6, 15]
>>> # use ProcessPool's map and a lambda to apply the star
>>> pp.map(lambda x: add(*x), x)
[6, 15]
>>> # use a _ProcessPool, which has starmap
>>> _pp = pathos.pools._ProcessPool()
>>> _pp.starmap(add, x)
[6, 15]
>>> 

A better solution for Python 2:

from multiprocessing import Pool

# Tuple parameter unpacking in the function signature is
# Python 2-only syntax (it was removed in Python 3).
def func((i, (a, b))):
    print i, a, b
    return a + b

pool = Pool(3)
pool.map(func, [(0, (1, 2)), (1, (2, 3)), (2, (3, 4))])

2 3 4
1 2 3
0 1 2

Out[]:

[3, 5, 7]

You can use the following two functions so as to avoid writing a wrapper for each new function:

import itertools
from multiprocessing import Pool

def universal_worker(input_pair):
    function, args = input_pair
    return function(*args)

def pool_args(function, *args):
    return zip(itertools.repeat(function), zip(*args))

Use the function function with the lists of arguments arg_0, arg_1, and arg_2 as follows:

pool = Pool(n_core)
list_model = pool.map(universal_worker, pool_args(function, arg_0, arg_1, arg_2))
pool.close()
pool.join()

Another simple alternative is to wrap your function parameters in a tuple and then wrap the parameters that should be passed in tuples as well. This is perhaps not ideal when dealing with large pieces of data. I believe it would make copies for each tuple.

from multiprocessing import Pool

def f((a,b,c,d)):
    print a,b,c,d
    return a + b + c +d

if __name__ == '__main__':
    p = Pool(10)
    data = [(i+0,i+1,i+2,i+3) for i in xrange(10)]
    print(p.map(f, data))
    p.close()
    p.join()

Gives the output in some random order:

0 1 2 3
1 2 3 4
2 3 4 5
3 4 5 6
4 5 6 7
5 6 7 8
7 8 9 10
6 7 8 9
8 9 10 11
9 10 11 12
[6, 10, 14, 18, 22, 26, 30, 34, 38, 42]

A better way is to use a decorator instead of writing a wrapper function by hand. Especially when you have a lot of functions to map, a decorator will save your time by avoiding writing a wrapper for every function. Usually a decorated function is not picklable; however, we may use functools to get around it. More discussions can be found here.

Here is the example:

def unpack_args(func):
    from functools import wraps
    @wraps(func)
    def wrapper(args):
        if isinstance(args, dict):
            return func(**args)
        else:
            return func(*args)
    return wrapper

@unpack_args
def func(x, y):
    return x + y

Then you may map it with zipped arguments:

from multiprocessing import Pool

np, xlist, ylist = 2, range(10), range(10)
pool = Pool(np)
res = pool.map(func, zip(xlist, ylist))
pool.close()
pool.join()

Of course, you may always use Pool.starmap in Python 3 (>=3.3) as mentioned in other answers.
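For example, a minimal starmap sketch with a plain (undecorated) two-argument function, since starmap does the unpacking itself -- plain_func here is a hypothetical name:

from multiprocessing import Pool

def plain_func(x, y):  # undecorated: starmap unpacks each (x, y) pair itself
    return x + y

if __name__ == '__main__':
    with Pool(2) as pool:
        res = pool.starmap(plain_func, zip(range(10), range(10)))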

Here is another way to do it that IMHO is more simple and elegant than any of the other answers provided.

This program has a function that takes two parameters, prints them out and also prints the sum:

import multiprocessing

def main():

    with multiprocessing.Pool(10) as pool:
        params = [ (2, 2), (3, 3), (4, 4) ]
        pool.starmap(printSum, params)
    # end with

# end function

def printSum(num1, num2):
    mySum = num1 + num2
    print('num1 = ' + str(num1) + ', num2 = ' + str(num2) + ', sum = ' + str(mySum))
# end function

if __name__ == '__main__':
    main()

Output is:

num1 = 2, num2 = 2, sum = 4
num1 = 3, num2 = 3, sum = 6
num1 = 4, num2 = 4, sum = 8

See the Python docs for more info:

https://docs.python.org/3/library/multiprocessing.html#module-multiprocessing.pool

In particular, be sure to check out the starmap function.

I'm using Python 3.6; I'm not sure if this will work with older Python versions.

I'm not sure why there isn't a very straightforward example like this in the docs.

Another way is to pass a list of lists to a one-argument routine:

import os
from multiprocessing import Pool

def task(args):
    print "PID =", os.getpid(), ", arg1 =", args[0], ", arg2 =", args[1]

pool = Pool()

pool.map(task, [
        [1,2],
        [3,4],
        [5,6],
        [7,8]
    ])

One can then construct a list of lists of arguments with one's favorite method.
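For instance, a small sketch (the xs and ys lists are made up) that builds the argument lists with zip and a comprehension:

xs = [1, 3, 5, 7]
ys = [2, 4, 6, 8]
# Each inner list becomes one call: task([1, 2]), task([3, 4]), ...
arg_lists = [[x, y] for (x, y) in zip(xs, ys)]
pool.map(task, arg_lists)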

From Python 3.4.4, you can use multiprocessing.get_context() to obtain a context object that supports multiple start methods:

import multiprocessing as mp

def foo(q, h, w):
    q.put(h + ' ' + w)
    print(h + ' ' + w)

if __name__ == '__main__':
    ctx = mp.get_context('spawn')
    q = ctx.Queue()
    p = ctx.Process(target=foo, args=(q,'hello', 'world'))
    p.start()
    print(q.get())
    p.join()

Or you can simply replace

pool.map(harvester(text,case),case, 1)

by:

pool.apply_async(harvester, (text, case))

The official documentation states that it supports only one iterable argument. I like to use apply_async in such cases. In your case, I would do:

from multiprocessing import Process, Pool, Manager

text = "test"

def harvester(text, case, q=None):
    X = case[0]
    res = text + str(X)
    if q:
        q.put(res)
    return res


def block_until(q, results_queue, until_counter=0):
    i = 0
    while i < until_counter:
        results_queue.put(q.get())
        i += 1

if __name__ == '__main__':
    pool = Pool(processes=6)
    case = RAW_DATASET
    m = Manager()
    q = m.Queue()
    results_queue = m.Queue()  # when it completes, results will reside in this queue
    blocking_process = Process(target=block_until, args=(q, results_queue, len(case)))
    blocking_process.start()
    for c in case:
        try:
            res = pool.apply_async(harvester, (text, c, q))
            res.get(timeout=0.1)
        except:
            pass
    blocking_process.join()

There are many answers here, but none seem to provide Python 2/3 compatible code that will work on any version. If you want your code to just work, this will work for either Python version:

import sys
import multiprocessing

# For python 2/3 compatibility, define pool context manager
# to support the 'with' statement in Python 2
if sys.version_info[0] == 2:
    from contextlib import contextmanager
    @contextmanager
    def multiprocessing_context(*args, **kwargs):
        pool = multiprocessing.Pool(*args, **kwargs)
        yield pool
        pool.terminate()
else:
    multiprocessing_context = multiprocessing.Pool

After that, you can use multiprocessing the regular Python 3 way, however you like. For example:

def _function_to_run_for_each(x):
    return x.lower()

with multiprocessing_context(processes=3) as pool:
    results = pool.map(_function_to_run_for_each, ['Bob', 'Sue', 'Tim'])
    print(results)

This will work in Python 2 or Python 3.

text = "test"

def unpack(args):
    return args[0](*args[1:])

def harvester(text, case):
    X = case[0]
    text+ str(X)

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=6)
    case = RAW_DATASET
    # args is a list of tuples 
    # with the function to execute as the first item in each tuple
    args = [(harvester, text, c) for c in case]
    # doing it this way, we can pass any function
    # and we don't need to define a wrapper for each different function
    # if we need to use more than one
    pool.map(unpack, args)
    pool.close()
    pool.join()

This is an example of the routine I use to pass multiple arguments to a one-argument function used in a pool.imap fork:

from multiprocessing import Pool

# Wrapper of the function to map:
class makefun:
    def __init__(self, var2):
        self.var2 = var2
    def fun(self, i):
        var2 = self.var2
        return var1[i] + var2

# Couple of variables for the example:
var1 = [1, 2, 3, 5, 6, 7, 8]
var2 = [9, 10, 11, 12]

# Open the pool:
pool = Pool(processes=2)

# Wrapper loop
for j in range(len(var2)):
    # Obtain the function to map
    pool_fun = makefun(var2[j]).fun

    # Fork loop
    for i, value in enumerate(pool.imap(pool_fun, range(len(var1))), 0):
        print(var1[i], '+' ,var2[j], '=', value)

# Close the pool
pool.close()

For me, the one below was a short and straightforward solution:

from multiprocessing.pool import ThreadPool
from functools import partial
from time import sleep
from random import randint

def dosomething(var,s):
    sleep(randint(1,5))
    print(var)
    return var + s

array = ["a", "b", "c", "d", "e"]
with ThreadPool(processes=5) as pool:
    resp_ = pool.map(partial(dosomething,s="2"), array)
    print(resp_)

Output:

a
b
d
e
c
['a2', 'b2', 'c2', 'd2', 'e2']

This might be another option. The trick is in the wrapper function that returns another function, which is passed in to pool.map. The code below reads an input array and, for each (unique) element in it, returns how many times (i.e. counts) that element appears in the array. For example, if the input is

np.eye(3) = [ [1. 0. 0.]
              [0. 1. 0.]
              [0. 0. 1.]]

then zero appears 6 times and one appears 3 times.

import numpy as np
from multiprocessing.dummy import Pool as ThreadPool
from multiprocessing import cpu_count


def extract_counts(label_array):
    labels = np.unique(label_array)
    out = extract_counts_helper([label_array], labels)
    return out

def extract_counts_helper(args, labels):
    n = max(1, cpu_count() - 1)
    pool = ThreadPool(n)
    results = {}
    pool.map(wrapper(args, results), labels)
    pool.close()
    pool.join()
    return results

def wrapper(argsin, results):
    def inner_fun(label):
        label_array = argsin[0]
        counts = get_label_counts(label_array, label)
        results[label] = counts
    return inner_fun

def get_label_counts(label_array, label):
    return sum(label_array.flatten() == label)

if __name__ == "__main__":
    img = np.ones([2,2])
    out = extract_counts(img)
    print('input array: \n', img)
    print('label counts: ', out)
    print("========")
           
    img = np.eye(3)
    out = extract_counts(img)
    print('input array: \n', img)
    print('label counts: ', out)
    print("========")
    
    img = np.random.randint(5, size=(3, 3))
    out = extract_counts(img)
    print('input array: \n', img)
    print('label counts: ', out)
    print("========")

You should get:

input array: 
 [[1. 1.]
 [1. 1.]]
label counts:  {1.0: 4}
========
input array: 
 [[1. 0. 0.]
 [0. 1. 0.]
 [0. 0. 1.]]
label counts:  {0.0: 6, 1.0: 3}
========
input array: 
 [[4 4 0]
 [2 4 3]
 [2 3 1]]
label counts:  {0: 1, 1: 1, 2: 2, 3: 2, 4: 3}
========

Store all your arguments as an ARRAY of TUPLES.

For example, say you normally call your function as

def mainImage(fragCoord : vec2, iResolution : vec3, iTime : float) -> vec3:

Instead, pass one tuple and unpack the arguments:

def mainImage(package_iter) -> vec3:
    fragCoord = package_iter[0]
    iResolution = package_iter[1]
    iTime = package_iter[2]

Build up the tuple by using a loop beforehand:

    package_iter = [] 
    iResolution = vec3(nx,ny,1)
    for j in range( (ny-1), -1, -1):
        for i in range( 0, nx, 1): 
            fragCoord : vec2 = vec2(i,j)
            time_elapsed_seconds = 10
            package_iter.append(  (fragCoord, iResolution, time_elapsed_seconds)  )

Then execute everything using map by passing the ARRAY of TUPLES:

    array_rgb_values = []

    with concurrent.futures.ProcessPoolExecutor() as executor: 
        for  val in executor.map(mainImage, package_iter):          
            fragColor=val
            ir = clip( int(255* fragColor.r), 0, 255)
            ig = clip(int(255* fragColor.g), 0, 255)
            ib= clip(int(255* fragColor.b), 0, 255)

            array_rgb_values.append( (ir,ig,ib) )

I know Python has * and ** for unpacking, but I haven't tried those yet. It is also better to use the higher-level concurrent.futures library than the low-level multiprocessing library.
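A rough sketch of how * unpacking could replace the manual indexing (untested; it assumes the original three-argument mainImage, and mainImage_star is a hypothetical wrapper name):

import concurrent.futures

def mainImage_star(package):
    # * unpacks (fragCoord, iResolution, iTime) into positional arguments
    return mainImage(*package)

if __name__ == '__main__':
    with concurrent.futures.ProcessPoolExecutor() as executor:
        results = list(executor.map(mainImage_star, package_iter))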

from multiprocessing import Pool


def f1(args):
    vfirst, vsecond, vthird = args
    print(f'First Param: {vfirst}, Second value: {vsecond} and finally third value is: {vthird}')


if __name__ == '__main__':
    p = Pool()
    result = p.map(f1, [['Dog', 'Cat', 'Mouse']])
    p.close()
    p.join()
    print(result)

For Python 2, you can use this trick:

import multiprocessing

def fun(a, b):
    return a + b

pool = multiprocessing.Pool(processes=6)
b = 233
# Note: the standard Pool pickles the mapped function, and plain pickle
# can't serialize lambdas, so this may raise a PicklingError with stock
# multiprocessing; forks such as pathos can pickle lambdas.
pool.map(lambda x: fun(x, b), range(1000))
