
Max size of RawArray for shared array in python

When trying to create an array to use for shared memory in multiple processes, I am getting an assertion error:

    shared_array = RawArray('d', xsize)
  File "C:\Python27\lib\multiprocessing\sharedctypes.py", line 88, in RawArray
    obj = _new_value(type_)
  File "C:\Python27\lib\multiprocessing\sharedctypes.py", line 68, in _new_value
    wrapper = heap.BufferWrapper(size)
  File "C:\Python27\lib\multiprocessing\heap.py", line 242, in __init__
    assert 0 <= size < sys.maxint
AssertionError

It seems that the size is above some maxint limit; however, the error occurs even when I run a basic example like the one below:

from multiprocessing.sharedctypes import RawArray
import sys

xsize = 999999999
#create an empty array    
print('MaxInt:',sys.maxint)
print('My Size:',xsize)
shared_array = RawArray('d', xsize)

The print statements show:

('MaxInt:', 2147483647)
('My Size:', 999999999)

Why is this happening, and how can I make a shared array for multiprocessing when the arrays are very large? My computer has 128GB of RAM, so that shouldn't be the issue.

The size checked by that assertion is in bytes, not elements: the element count you pass is multiplied by the size of the element type. Your typecode 'd' is a C double, which is typically 8 bytes, so you can compute the maximum number of elements for your build with:

from sys import maxint
from ctypes import sizeof, c_double

maxint // sizeof(c_double)

For you (having a 32-bit build, where sys.maxint is 2**31 - 1) this is ~268M elements, so your request for 999999999 doubles (~8GB) fails the assertion. However, if you are running a 32-bit build then your address space will be a much greater limiting factor, and you almost certainly won't be able to allocate an array anywhere near that big anyway.
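To make the arithmetic concrete, a quick check (runnable on any build) shows the byte count the original `RawArray('d', xsize)` call must allocate:

```python
from ctypes import sizeof, c_double

xsize = 999999999
# RawArray('d', xsize) must allocate xsize * sizeof(c_double) bytes
needed_bytes = xsize * sizeof(c_double)
print(needed_bytes)              # 7999999992, i.e. ~8GB
print(needed_bytes > 2**31 - 1)  # True: exceeds a 32-bit sys.maxint
```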

I'm using 64-bit builds of Python 2.7 and 3.7, where sys.maxint (Python 2 only) and sys.maxsize are both 2**63 - 1, and both allow me to allocate an array with a billion (10**9) elements (and I can see Python having a ~4GB RSS), albeit taking a while to zero it all out.
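As a minimal sketch of the intended usage (names are illustrative, and the size is kept small here so it runs quickly; on a 64-bit build you can scale `xsize` up), a RawArray can be handed to a worker process, which writes into the shared buffer directly:

```python
from multiprocessing import Process
from multiprocessing.sharedctypes import RawArray

def fill(arr):
    # The child writes straight into the shared memory buffer;
    # the data itself is never pickled or copied between processes.
    for i in range(len(arr)):
        arr[i] = i * 0.5

if __name__ == '__main__':
    xsize = 10  # use a much larger size on a 64-bit build
    shared_array = RawArray('d', xsize)
    p = Process(target=fill, args=(shared_array,))
    p.start()
    p.join()
    print(list(shared_array))  # values written by the child are visible here
```

Note that RawArray carries no lock; if several processes write to overlapping regions concurrently, use multiprocessing.sharedctypes.Array (or an explicit Lock) instead.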

