I want to investigate the memory usage of a Python program that uses `numpy.memmap` to access data from large files. Is there a way to check the size in memory that a memmap is currently using? I tried `sys.getsizeof` on the numpy object and the `_mmap` attribute of the object, but both gave the same very small size regardless of how much the memmap object had been used.
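The observation above can be reproduced with a small sketch (the file name and shape are made up for illustration): `sys.getsizeof` reports only the size of the Python wrapper objects, not the mapped data.

```python
import os
import sys
import tempfile

import numpy as np

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "data.dat")  # hypothetical data file
    # Map ~10 MB of data.
    big = np.memmap(path, dtype=np.uint8, mode="w+", shape=(10_000_000,))
    # Both report only small object headers, not the 10 MB mapping.
    size_array = sys.getsizeof(big)
    size_mmap = sys.getsizeof(big._mmap)
    print(size_array, size_mmap)
    del big  # release the mapping before the temp dir is removed
```

Both printed values stay in the tens-to-hundreds-of-bytes range no matter how large the mapping is.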
I found that `array.base` gives you a `mmap.mmap` instance (assuming that `array` is a `numpy.memmap` instance). `mmap.mmap` supports `len()`, so I am now using:
    import mmap
    import sys

    def recursive_size_of(list_or_array) -> int:
        """Approximate total size in bytes, following list items and array bases."""
        result = sys.getsizeof(list_or_array)
        if isinstance(list_or_array, list):
            result += sum(map(recursive_size_of, list_or_array))
        elif isinstance(list_or_array, mmap.mmap):
            # len() of an mmap.mmap is the mapped length in bytes.
            result += len(list_or_array)
        else:
            base_array = getattr(list_or_array, 'base', None)
            if base_array is not None:
                result += recursive_size_of(base_array)
        return result
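As a usage sketch (the file name and shape are hypothetical; the function is repeated here so the snippet runs standalone):

```python
import mmap
import os
import sys
import tempfile

import numpy as np

def recursive_size_of(list_or_array) -> int:
    """Approximate total size in bytes, following list items and array bases."""
    result = sys.getsizeof(list_or_array)
    if isinstance(list_or_array, list):
        result += sum(map(recursive_size_of, list_or_array))
    elif isinstance(list_or_array, mmap.mmap):
        result += len(list_or_array)
    else:
        base_array = getattr(list_or_array, 'base', None)
        if base_array is not None:
            result += recursive_size_of(base_array)
    return result

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "data.dat")  # hypothetical data file
    arr = np.memmap(path, dtype=np.float64, mode="w+", shape=(500,))
    # The result now includes the 500 * 8 = 4000 mapped bytes,
    # plus the small list/array/mmap object headers.
    total = recursive_size_of([arr])
    print(total)
    del arr  # release the mapping before the temp dir is removed
```

Note that this counts the full mapped length, not the portion currently resident in physical memory.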
If you know that your array is exactly a `numpy.memmap` and feel confident about making that assumption in your code, `len(array.base)` would suffice.
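For example (the file name is hypothetical), the mapped length comes straight out of the underlying `mmap.mmap`:

```python
import os
import tempfile

import numpy as np

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "data.dat")  # hypothetical data file
    arr = np.memmap(path, dtype=np.float64, mode="w+", shape=(1000,))
    # arr.base is the underlying mmap.mmap; len() gives the mapped size in bytes.
    mapped_len = len(arr.base)
    print(mapped_len)  # 1000 elements * 8 bytes = 8000
    del arr  # release the mapping before the temp dir is removed
```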