
How much memory is used by a numpy ndarray?

Does anybody know how much memory is used by a numpy ndarray? (with, say, 10,000,000 float elements)

The array is simply stored in one contiguous block of memory. Assuming that by "float" you mean a standard double-precision floating-point number, the array will need 8 bytes per element.
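For the 10,000,000-element case from the question, the back-of-the-envelope arithmetic looks like this (a minimal sketch; the variable names are just for illustration):

n_elements = 10_000_000
bytes_per_element = 8          # float64
total_bytes = n_elements * bytes_per_element
print(total_bytes)             # 80000000 bytes
print(total_bytes / 1024**2)   # ~76.3 MiB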

In general, you can simply query the nbytes attribute for the total memory requirement of an array, and itemsize for the size of a single element in bytes:

>>> import numpy
>>> a = numpy.arange(1000.0)
>>> a.nbytes
8000
>>> a.itemsize
8
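Applied to the array size from the question, the same attributes confirm the estimate above (a small sketch; np.zeros is used here only to allocate an array of that size):

>>> import numpy as np
>>> a = np.zeros(10_000_000)   # dtype defaults to float64
>>> a.itemsize
8
>>> a.nbytes                   # equals a.size * a.itemsize
80000000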

In addition to the actual array data, there is also a small data structure containing the meta-information about the array (shape, strides, dtype, and so on). Especially for large arrays, the size of this structure is negligible.

To get the total memory footprint of the NumPy array in bytes, including the metadata, you can use Python's sys.getsizeof() function:

import sys
import numpy as np

a = np.arange(1000.0)

print(sys.getsizeof(a))  # array data plus the ndarray object's own overhead

The result is 8104 bytes: 8000 bytes of array data plus 104 bytes of object overhead (the exact overhead can vary with platform and NumPy version).

sys.getsizeof() works for any Python object. It reports the object's internal memory allocation, not necessarily the memory footprint of the object once it is written out to some file format. Sometimes it is wildly misleading: for an array that is a view of another array's data (such as a row vector sliced out of a 2d array), it counts only the small ndarray header and not the underlying data buffer.
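A quick way to see this effect (a minimal sketch; exact byte counts depend on platform and NumPy version):

import sys
import numpy as np

base = np.zeros((1000, 1000))   # owns its 8,000,000-byte buffer
row = base[0]                   # a view: shares base's buffer, owns no data

print(sys.getsizeof(base))  # roughly 8,000,000 plus a small header
print(sys.getsizeof(row))   # only a small header; the shared buffer is not counted
print(row.nbytes)           # 8000 -- the data the view refers to, via nbytes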

See the docs here. Ned Batchelder shares caveats about using sys.getsizeof() here.

I guess it's easy: we can use print(a.size // 1024 // 1024, a.dtype) to see roughly how many MB the array uses, but you have to factor in the dtype: float = 8 B, int8 = 1 B, and so on.
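A version of that estimate that factors the dtype in directly might look like this (a small sketch; the helper name mb_used is just for illustration):

import numpy as np

def mb_used(a):
    """Data size of the array in MiB: element count times bytes per element."""
    return a.size * a.itemsize / 1024**2

print(mb_used(np.zeros(10_000_000)))                 # ~76.3 (float64, 8 bytes each)
print(mb_used(np.zeros(10_000_000, dtype=np.int8)))  # ~9.5  (int8, 1 byte each)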
