When trying to create an array to use as shared memory across multiple processes, I am getting an assertion error:
shared_array = RawArray('d', xsize)
File "C:\Python27\lib\multiprocessing\sharedctypes.py", line 88, in RawArray
obj = _new_value(type_)
File "C:\Python27\lib\multiprocessing\sharedctypes.py", line 68, in _new_value
wrapper = heap.BufferWrapper(size)
File "C:\Python27\lib\multiprocessing\heap.py", line 242, in __init__
assert 0 <= size < sys.maxint
AssertionError
It seems the size is above some maxint limit; however, the error occurs even when I run a basic example like the one below:
from multiprocessing.sharedctypes import RawArray
import sys
xsize = 999999999
# create an empty shared array
print('MaxInt:',sys.maxint)
print('My Size:',xsize)
shared_array = RawArray('d', xsize)
The print statements show:
('MaxInt:', 2147483647)
('My Size:', 999999999)
Why is this happening, and how can I create a shared array for multiprocessing when the arrays are very large? My computer has 128 GB of RAM, so memory itself shouldn't be the issue.
The size checked by that assertion is in bytes, not elements. An int
here is a "C style" int, which tends to be 4 bytes; you can ask ctypes for its exact size, and hence compute the maximum number of elements for an array of that type, by doing:
from sys import maxint
from ctypes import sizeof, c_int
maxint // sizeof(c_int)
which for you (having a 32-bit build, given that your sys.maxint is 2147483647) is ~512M elements. Your array uses the 'd' typecode, i.e. an 8-byte C double, so 999999999 elements need roughly 8 GB, which is well above sys.maxint in bytes. Moreover, on a 32-bit build your address space is a much greater limiting factor, and you almost certainly won't be able to allocate an array that big.
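The same calculation can be repeated for the 'd' typecode from the question. A small sketch (using sys.maxsize on Python 3, where sys.maxint no longer exists, as the equivalent limit):

```python
import sys
from ctypes import sizeof, c_int, c_double

# sys.maxint exists only on Python 2; sys.maxsize plays the same role here.
limit = getattr(sys, 'maxint', sys.maxsize)

# Maximum element counts implied by the byte-size assertion, per element type.
max_c_int_elems = limit // sizeof(c_int)        # 4-byte C int
max_c_double_elems = limit // sizeof(c_double)  # typecode 'd' is an 8-byte C double

print(max_c_int_elems, max_c_double_elems)
```

On a 32-bit build (limit 2147483647) this gives roughly 512M ints but only ~256M doubles, which is why 999999999 doubles trips the assertion.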
I'm using 64-bit builds of Python 2.7 and 3.7, where sys.maxint (Python 2 only) and sys.maxsize are both 2**63 - 1, and both builds allow me to allocate an array with a billion (10**9) elements (I can see Python holding a ~4 GB RSS), albeit taking a while to zero it all out.