
Python: Multi-processing a function call inside for loop so that each call runs independently

I have a function that encrypts a number and stores it in a list

encrypted = [[0]*10]*1000

def encrypt(i):
    encrypted[i] = bin(i)[2:].zfill(10).decode('hex')

The expression is much more complex than this; I am just giving an example.

Now I want to call the encrypt function inside a for loop, with the calls running in different processes or threads. However, because of the GIL, threads won't help for a CPU-bound task (correct me if I am wrong).

for i in xrange(1000):
    encrypt(i)

So the loop should not wait for the encryption of one value to finish before the next one starts.

So when i = 1 and 1 is being encrypted, the for loop should move on and start encrypting 2, and then 3, simultaneously.

The results of encryption should be stored in the encrypted list (the order of the results is not important).
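One pitfall worth flagging: separate processes do not share the parent's memory, so an encrypt that mutates a module-level encrypted list will only change the child process's copy. A minimal Python 3 demonstration (the simplified encrypt here is a stand-in for the real expression, with the hex decode omitted):

```python
from multiprocessing import Process

encrypted = [None] * 10

def encrypt(i):
    # mutates whichever process's copy of the list this runs in
    encrypted[i] = bin(i)[2:].zfill(10)

if __name__ == '__main__':
    p = Process(target=encrypt, args=(3,))
    p.start()
    p.join()
    # the child updated its own copy; here in the parent it is still None
    print(encrypted[3])
```

This is why the answers below either return results from the worker function or pull them off a queue, rather than writing into a shared global.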

Alright, first some advice. Depending on the number of threads you need to run, you should check out PyPy; this sounds like the kind of project that could benefit heavily from PyPy's features.

Here is an edited example from the Queue docs; if I understand what you need, this should point you in the right direction.

This code assumes that you have a list of numbers to encrypt and that your encrypt function handles adding the results to a list or storing them somehow.

from Queue import Queue   # "queue" in Python 3
from threading import Thread

num_worker_threads = 4    # tune to taste
numbers = range(1000)     # the values to encrypt

def worker():
    while True:
        number = q.get()
        encrypt(number)
        q.task_done()

q = Queue()
for i in range(num_worker_threads):
    t = Thread(target=worker)
    t.daemon = True
    t.start()

for number in numbers:
    q.put(number)

q.join()       # block until all tasks are done
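Since the asker notes the work is CPU-bound, the same queue/worker pattern can be run with processes instead of threads to sidestep the GIL. A hedged Python 3 sketch using multiprocessing.JoinableQueue (again with a simplified stand-in for encrypt):

```python
from multiprocessing import Process, JoinableQueue, Queue

def encrypt(i):
    # stand-in for the real (more complex) expression
    return bin(i)[2:].zfill(10)

def worker(tasks, results):
    while True:
        number = tasks.get()
        results.put(encrypt(number))
        tasks.task_done()

if __name__ == '__main__':
    tasks = JoinableQueue()
    results = Queue()
    for _ in range(4):  # roughly one worker per core
        p = Process(target=worker, args=(tasks, results))
        p.daemon = True
        p.start()

    for number in range(1000):
        tasks.put(number)
    tasks.join()  # block until all tasks are done

    # drain results; order reflects completion, not submission
    encrypted = [results.get() for _ in range(1000)]
```

Note that the workers communicate results back through a queue rather than mutating a shared list, since each process has its own memory.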

You can use multiprocessing.Pool:

from multiprocessing import Pool

def encrypt(i):
    return bin(i)[2:].zfill(10).decode('hex')

if __name__ == '__main__':
    pool = Pool(processes=4)  # adjust to number of cores
    result = pool.map(encrypt, range(1000))
    print result
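Since the question says the order of results does not matter, Pool.imap_unordered can hand results back as workers finish instead of waiting for input order. A Python 3 variant (bytes.fromhex replaces the Python 2-only str.decode('hex')):

```python
from multiprocessing import Pool

def encrypt(i):
    # Python 3 spelling of bin(i)[2:].zfill(10).decode('hex')
    return bytes.fromhex(bin(i)[2:].zfill(10))

if __name__ == '__main__':
    with Pool(processes=4) as pool:
        # results are yielded in completion order, not input order
        encrypted = list(pool.imap_unordered(encrypt, range(1000), chunksize=64))
```

The chunksize argument batches tasks per worker, which cuts inter-process overhead when the per-item work is small.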
