
Getting a pickle error when trying to run processes

What I'm trying to do is run prime number decomposition of a list of numbers in several processes at once. I have a threaded version that works, but I can't seem to get it working with processes.

import math
from Queue import Queue
import multiprocessing

def primes2(n):
    primfac = []
    num = n
    d = 2
    while d * d <= n:
        while (n % d) == 0:
            primfac.append(d) # supposing you want multiple factors repeated
            n //= d
        d += 1
    if n > 1:
        primfac.append(n)
    myfile = open('processresults.txt', 'a')
    myfile.write(str(num) + ":" + str(primfac) + "\n")
    return primfac

def mp_factorizer(nums, nprocs):
    def worker(nums, out_q):
        """ The worker function, invoked in a process. 'nums' is a
            list of numbers to factor. The results are placed in
            a dictionary that's pushed to a queue.
        """
        outdict = {}
        for n in nums:
            outdict[n] = primes2(n)
        out_q.put(outdict)

    # Each process will get 'chunksize' nums and a queue to put his out
    # dict into
    out_q = Queue()
    chunksize = int(math.ceil(len(nums) / float(nprocs)))
    procs = []

    for i in range(nprocs):
        p = multiprocessing.Process(
                target=worker,
                args=(nums[chunksize * i:chunksize * (i + 1)],
                      out_q))
        procs.append(p)
        p.start()

    # Collect all results into a single result dict. We know how many dicts
    # with results to expect.
    resultdict = {}
    for i in range(nprocs):
        resultdict.update(out_q.get())

    # Wait for all worker processes to finish
    for p in procs:
        p.join()

    print resultdict

if __name__ == '__main__':

    mp_factorizer((400243534500, 100345345000, 600034522000, 9000045346435345000), 4)

I'm getting a pickle error shown below:

[screenshot of the pickle error traceback]

Any help would be greatly appreciated :)

You need to use multiprocessing.Queue instead of a regular Queue.

This is because a Process doesn't run in the same memory space, and some objects aren't picklable, such as a regular queue ( Queue.Queue ). To overcome this, the multiprocessing library provides a Queue class that is actually a proxy to a queue.

Also, you should extract def worker(...) out to module level like any other function. This could be your main problem, because of how a process is forked at the OS level.
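A minimal sketch of both fixes, assuming the primes2 function from the question is available at module level (names are kept from the question for clarity):

import math
import multiprocessing

def worker(nums, out_q):
    """ Invoked in a child process: factor each number and put a dict
        of results on the queue. """
    outdict = {}
    for n in nums:
        outdict[n] = primes2(n)  # primes2 as defined in the question
    out_q.put(outdict)

def mp_factorizer(nums, nprocs):
    out_q = multiprocessing.Queue()  # process-aware queue instead of Queue.Queue
    chunksize = int(math.ceil(len(nums) / float(nprocs)))
    procs = []

    for i in range(nprocs):
        p = multiprocessing.Process(
                target=worker,  # module-level function, so it can be pickled
                args=(nums[chunksize * i:chunksize * (i + 1)], out_q))
        procs.append(p)
        p.start()

    # Drain the queue before joining, as in the original code
    resultdict = {}
    for i in range(nprocs):
        resultdict.update(out_q.get())

    for p in procs:
        p.join()

    return resultdict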

You can also use a multiprocessing.Manager.
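For example, a minimal sketch using a Manager-backed dict instead of a queue (the manager_worker / mp_factorizer_managed names are made up for illustration; primes2 is the function from the question):

import math
import multiprocessing

def manager_worker(nums, shared_dict):
    # Writes results into a proxy dict served by the Manager process.
    for n in nums:
        shared_dict[n] = primes2(n)

def mp_factorizer_managed(nums, nprocs):
    manager = multiprocessing.Manager()
    shared_dict = manager.dict()  # proxy object shared across processes
    chunksize = int(math.ceil(len(nums) / float(nprocs)))
    procs = []

    for i in range(nprocs):
        p = multiprocessing.Process(
                target=manager_worker,
                args=(nums[chunksize * i:chunksize * (i + 1)], shared_dict))
        procs.append(p)
        p.start()

    for p in procs:
        p.join()

    return dict(shared_dict)  # copy out of the proxy into a plain dict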

Dynamically created functions cannot be pickled and therefore cannot be used as the target of a Process. The worker function needs to be at module (global) scope, not defined inside mp_factorizer.
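A quick way to see this restriction in isolation (illustrative snippet, not from the original post):

import pickle

def outer():
    def inner():
        pass
    return inner

# A nested (dynamically created) function has no importable module-level name,
# so pickle cannot serialize a reference to it.
try:
    pickle.dumps(outer())
except (pickle.PicklingError, AttributeError) as exc:
    print("cannot pickle nested function: %s" % exc)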
