
Why can python lists hold more data than numpy arrays?

Sorry, I feel like this may be a basic question, but I did not find any "solutions" for this.

I am filling a python list with a lot of data, and finally want to convert it to a numpy.array for further processing.

However, when I call numpy.asarray(my_list), I get an out-of-memory error. Why does that happen? Is it because numpy.array objects are stored in contiguous memory blocks, and there is not enough free space for such a block?
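One way to see where the memory goes (a sketch with an illustrative element count): a Python list of floats stores a pointer per element plus a boxed float object for each value, while a numpy array stores the raw 8-byte values contiguously. During numpy.asarray, both must exist at once.

```python
import sys
import numpy as np

n = 100_000
lst = [float(i) for i in range(n)]
arr = np.asarray(lst)

# List cost: the pointer array itself plus one Python float object per element.
list_bytes = sys.getsizeof(lst) + sum(sys.getsizeof(x) for x in lst)
# Array cost: 8 bytes per float64, stored in one contiguous block.
print(list_bytes, arr.nbytes)
```

The list typically needs several times more memory than the resulting array, and the conversion step temporarily needs the sum of both.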

How do I best handle such large data volumes then? I guess numpy is definitely the way to go, so I am a bit curious why I can handle such volumes with plain list objects but not with my current numpy approach.

Again, repeating my most important question: how can I best handle data that fits into Python lists (so I guess it somehow still fits in my memory overall) but cannot be converted to a numpy.array?

Thanks!

Allocate the memory for a numpy array up front and never create a list in the first place. Converting with numpy.asarray requires the full list and the new contiguous array to exist in memory at the same time, which is what pushes you over the limit.
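A minimal sketch of that approach, assuming you know (or can bound) the final shape and dtype in advance; the dimensions and fill values here are illustrative:

```python
import numpy as np

# Hypothetical dimensions; replace with your actual data shape.
n_rows, n_cols = 100_000, 3

# Preallocate one contiguous block instead of growing a Python list.
data = np.empty((n_rows, n_cols), dtype=np.float64)

for i in range(n_rows):
    # Write each record directly into its row as you produce it.
    data[i] = (i, i * 2.0, i * 0.5)

print(data.shape)  # (100000, 3)
```

If the total size is unknown, preallocate a generous upper bound and slice off the used portion at the end, rather than appending to a list.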

memmap should not be necessary, as the original list fits in memory.
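For completeness, if the data ever truly outgrows RAM, a file-backed array is the usual escape hatch. A minimal sketch, assuming a hypothetical file path and size:

```python
import os
import tempfile
import numpy as np

# Illustrative path and length; adjust to your own data.
path = os.path.join(tempfile.gettempdir(), "big_array.dat")
n = 1_000_000

# The array lives in a file on disk; the OS pages chunks in on demand,
# so it does not need to fit in RAM all at once.
mm = np.memmap(path, dtype=np.float64, mode="w+", shape=(n,))
mm[:] = np.arange(n)
mm.flush()  # make sure the data reaches the file
```

The memmap object supports the usual numpy slicing and ufuncs, so downstream code often works unchanged.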

