
Python: How much space does each element of a list take?

I need a very large list, and am trying to figure out how big I can make it so that it still fits in 1-2GB of RAM. I am using the CPython implementation, on 64 bit (x86_64).

Edit: thanks to bua's answer, I have filled in some of the more concrete answers.

What is the space (memory) usage, in bytes, of each of the following (a sketch that reproduces these measurements follows the list):

  • the list itself
    • sys.getsizeof([]) == 72
  • each list entry (not including the data)
    • sys.getsizeof([0, 1, 2, 3]) == 104, so 8 bytes of overhead per entry (one pointer on 64-bit).
  • the data if it is an integer
    • sys.getsizeof(2**62) == 24 (but varies according to integer size)
    • sys.getsizeof(2**63) == 40
    • sys.getsizeof(2**128) == 48
    • sys.getsizeof(2**256) == 66
  • the data if it is an object (sizeof(PyObject), I guess)
    • sys.getsizeof(C()) == 72 (where C is an empty user-defined class)
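
To reproduce these figures on your own interpreter, here is a minimal sketch (the exact numbers vary with the CPython version and with 32- vs 64-bit builds, so treat the values above as one observed data point):

import sys

class C(object):
    """Empty user-defined class, used to measure bare instance overhead."""
    pass

print(sys.getsizeof([]))            # empty list object (72 above)
print(sys.getsizeof([0, 1, 2, 3]))  # plus one pointer per entry (104 above)
print(sys.getsizeof(2**62))         # int objects grow with their magnitude
print(sys.getsizeof(2**128))
print(sys.getsizeof(C()))           # instance of an empty class (72 above)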

If you can share more general data about the observed sizes, that would be great. For example:

  • Are there special cases (I think immutable values might be shared, so maybe a list of bools doesn't take any extra space for the data)?
  • Perhaps small lists take X bytes of overhead but large lists take Y bytes? (A quick check of both points is sketched below.)
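
A rough way to check both points, as a sketch on a 64-bit CPython 3 (behaviour such as small-int caching is a CPython implementation detail, not a language guarantee):

import sys

# True and False are singletons, so a list of bools only costs the pointers;
# the two bool objects themselves are paid for once, not per entry.
flags = [True] * 1000000
print(sys.getsizeof(flags))            # list header + 1,000,000 pointers
print(sys.getsizeof(True))             # counted once, shared by every entry

# CPython also caches small ints (-5..256), so those are shared the same way.
x, y = int("200"), int("200")
print(x is y)                          # True: both names point to the cached 200

# Per-entry overhead is one pointer regardless of list length (append() adds
# some over-allocation slack on top, but the growth stays linear).
for n in (10, 1000, 1000000):
    lst = list(range(n))
    print(n, (sys.getsizeof(lst) - sys.getsizeof([])) / n)   # ~8.0 on 64-bit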

A point to start from:

>>> import sys
>>> a=list()
>>> type(a)
<type 'list'>
>>> sys.getsizeof(a)
36
>>> b=1
>>> type(b)
<type 'int'>
>>> sys.getsizeof(b)
12

and from python help:

>>> help(sys.getsizeof)
Help on built-in function getsizeof in module sys:

getsizeof(...)
    getsizeof(object, default) -> int

    Return the size of object in bytes.
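
One caveat worth adding: getsizeof only counts the object it is given, not the objects it references, so a list's entries have to be summed separately. A rough sketch for estimating a flat list's total footprint (rough_list_footprint is just an illustrative helper; it ignores shared objects and nested containers):

import sys

def rough_list_footprint(lst):
    # List object (header + one pointer per entry) plus each element's own size.
    # Over-counts shared objects (e.g. cached small ints) and ignores nesting.
    return sys.getsizeof(lst) + sum(sys.getsizeof(x) for x in lst)

sample = list(range(1000))
per_entry = rough_list_footprint(sample) / len(sample)
print(per_entry)               # bytes per entry: pointer + int object
print(int(1e9 / per_entry))    # very rough element count for ~1 GB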

If you want lists of numerical values, the standard-library array module provides compact, typed arrays (which also support append).
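
A minimal comparison sketch, assuming the values fit the signed 64-bit 'q' typecode:

import sys
from array import array

n = 1000000
as_list = list(range(n))          # list of full Python int objects
as_array = array('q', range(n))   # packed 8-byte machine integers

print(sys.getsizeof(as_list))     # pointers only; the int objects cost extra
print(sys.getsizeof(as_array))    # roughly n * 8 bytes, data included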

The third-party but widely used NumPy package gives you efficient fixed-size arrays.
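
A minimal sketch, assuming NumPy is installed:

import numpy as np

arr = np.zeros(1000000, dtype=np.int64)   # fixed-size: 8 bytes per element
print(arr.nbytes)                         # 8000000: size of the raw data buffer
print(arr.itemsize, arr.size)             # bytes per element, element count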
