Shared memory between processes

I'm playing around with the multiprocessing module in Python, trying to parallelize an algorithm that loops through a list with a different increment each time (a modification of the Sieve of Eratosthenes). I therefore want a single list shared between all of the processes, so that every process modifies the same list. I've tried the multiprocessing.Array function, but when I reach the end of the program the array is still unmodified and still contains all 0's (the value I initialized it to).

import multiprocessing
import math

num_cores = multiprocessing.cpu_count()

lower = 0
mark = None

def mark_array(k):
    global mark
    index = (-(-lower//k)*k)-lower
    for i in range(index, len(mark), k):
        mark[i] = 1

def sieve(upper_bound, lower_bound):
    size = upper_bound - lower_bound + 1

    global mark
    mark = multiprocessing.Array('i', size, lock=False)
    for i in range(size):
        mark[i] = 0

    klimit = int(math.sqrt(upper_bound)) + 1
    global lower
    lower = lower_bound

    if __name__ == '__main__':
        pool = multiprocessing.Pool(processes=num_cores)
        inputs = list(range(2, klimit+1))
        pool.map(mark_array, inputs)
        pool.close()
        pool.join()

        result = []
        for i in range(size):
            result.append(mark[i])
        print(result)

sieve(200,100)

Pardon the code. It's a bit messy, but I'm just trying to get the shared memory to work before I clean it up.

EDIT: Ok, so I tried the exact same code on a Linux machine and there I get the expected output. However, running the same code in VS Code on a Windows machine does not. Any idea why?

EDIT #2: This seems to be a Windows-specific issue, since Windows handles process creation differently than Linux. If that's the case, any idea how to solve it?
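For reference, the start method in use can be checked like this (a small diagnostic sketch using the standard get_start_method / set_start_method calls):

import multiprocessing

if __name__ == '__main__':
    # Linux defaults to "fork", so children inherit the parent's globals
    # (including the module-level mark array); Windows only supports "spawn",
    # where each child re-imports the module and gets fresh globals.
    print(multiprocessing.get_start_method())

    # Uncomment to reproduce the Windows behaviour on Linux:
    # multiprocessing.set_start_method("spawn")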

You could try multiprocessing.Manager for your task. On Windows there is no fork, so worker processes are spawned: each worker re-imports your module, which re-runs sieve(200, 100) at import time and creates its own fresh Array instead of receiving the one the parent built. The workers then mark their private copies while the parent's array stays all zeros. A Manager hosts the list in a single server process and hands each worker a proxy to it, so writes are visible everywhere regardless of the start method:

import multiprocessing
import math
from functools import partial

num_cores = multiprocessing.cpu_count()


def mark_array(mark, lower, k):
    # Mark every multiple of k that falls inside [lower, lower + len(mark) - 1].
    index = (-(-lower // k) * k) - lower
    for i in range(index, len(mark), k):
        mark[i] = 1


def sieve(upper_bound, lower_bound):
    size = upper_bound - lower_bound + 1
    klimit = int(math.sqrt(upper_bound)) + 1

    if __name__ == '__main__':
        pool = multiprocessing.Pool(processes=num_cores)
        with multiprocessing.Manager() as manager:
            # The manager process owns the list; workers receive a proxy to it,
            # so their writes are visible in the parent, on Windows as well.
            mark = manager.list([0] * size)

            # Bind the shared list and the lower bound as arguments instead of
            # relying on module-level globals, which spawned workers never see.
            foo = partial(mark_array, mark, lower_bound)

            pool.map(foo, range(2, klimit + 1))
            pool.close()
            pool.join()

            result = list(mark)
            print(result)


sieve(200, 100)
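If you'd rather keep multiprocessing.Array (a Manager proxy pays an inter-process round trip for every element access), another option that should also work on Windows is to hand the shared array to each worker through the pool's initializer, so it reaches the workers at start-up rather than through module globals. A rough sketch of that approach (same algorithm, untested on your setup):

import multiprocessing
import math


def init_worker(shared_mark, lower_bound):
    # Runs once in every worker process: stash the shared array and the lower
    # bound in globals so mark_array can reach them even under "spawn".
    global mark, lower
    mark = shared_mark
    lower = lower_bound


def mark_array(k):
    # Mark every multiple of k inside [lower, lower + len(mark) - 1].
    index = (-(-lower // k) * k) - lower
    for i in range(index, len(mark), k):
        mark[i] = 1


def sieve(upper_bound, lower_bound):
    size = upper_bound - lower_bound + 1
    klimit = int(math.sqrt(upper_bound)) + 1

    # lock=False gives a plain shared int array, zero-initialised.
    mark = multiprocessing.Array('i', size, lock=False)

    with multiprocessing.Pool(processes=multiprocessing.cpu_count(),
                              initializer=init_worker,
                              initargs=(mark, lower_bound)) as pool:
        pool.map(mark_array, range(2, klimit + 1))

    print(list(mark))


if __name__ == '__main__':
    sieve(200, 100)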
