How much memory in numpy array? Is RAM a limiting factor?
I'm using numpy to create a cube array with sides of length 100, thus containing 1 million entries total. For each of the million entries, I am inserting a 100x100 matrix whose entries are comprised of randomly generated numbers. I am using the following code to do so:
import random
from numpy import *

cube = arange(1000000).reshape(100,100,100)
for element in cube.flat:
    matrix = arange(10000).reshape(100,100)
    for entry in matrix.flat:
        entry = random.random()*100
    element = matrix
I was expecting this to take a while, but with 10 billion random numbers being generated, I'm not sure my computer can even handle it. How much memory would such an array take up? Would RAM be a limiting factor, i.e. if my computer doesn't have enough RAM, could it fail to actually generate the array?

Also, if there is a more efficient way to implement this code, I would appreciate tips :)
A couple of points:

cube.dtype is int64, and since it has 1,000,000 elements, it will require 1000000 * 64 / 8 = 8,000,000 bytes (8 MB).
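That arithmetic can be checked directly with the array's `nbytes` attribute, assuming the default integer dtype is int64 (as on most 64-bit platforms):

```python
import numpy as np

# the question's cube: 100**3 entries, forced to int64 for a platform-independent check
cube = np.arange(1000000, dtype=np.int64).reshape(100, 100, 100)
print(cube.nbytes)  # 8000000 -- 1,000,000 elements * 8 bytes each
```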
Also, element = matrix will not replace an element of the cube; it will simply overwrite the element variable, leaving the cube unchanged. The same goes for entry = random.random() * 100.
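A minimal demonstration of that rebinding behavior:

```python
import numpy as np

a = np.zeros(3)
for x in a.flat:
    x = 99.0          # rebinds the loop variable only; the array is untouched
print(a)              # [0. 0. 0.]

# to actually modify the array, assign through an index or slice:
a[:] = 99.0
print(a)              # [99. 99. 99.]
```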
For the "inner" part of your function, look at the numpy.random module:
import numpy as np
matrix = np.random.random((100,100))*100
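For scale, note what the question's full structure implies: a 100x100x100 cube where every entry is itself a 100x100 matrix of floats is 10 billion float64 values. A back-of-envelope sketch of the memory that would need:

```python
# 100**3 cube entries, each a 100x100 matrix of float64 values
n_values = 100**3 * 100 * 100      # 10,000,000,000 values
bytes_needed = n_values * 8        # 8 bytes per float64
print(bytes_needed / 1e9)          # 80.0 -- i.e. 80 GB, far more than typical RAM
```

This is why RAM is indeed the limiting factor here: numpy will raise a MemoryError (or the OS will start swapping) long before the array is filled.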