
Confusion with weights dumping from neural net in Keras

I created a simple 2-layer network, i.e. one with a single hidden layer. I am dumping the weights from the middle layer to visualize what the hidden neurons are learning. I am using:

weights = model.layers[0].get_weights() 

When I look at the weights structure I get:

So len(weights) = 2, len(weights[0]) = 500, len(weights[1]) = 100.

I want to create an array m of size (500, 100), so that m.shape = (500, 100). I tried numpy.reshape(weights, 500, 100) and zip(weights[0], weights[1]); then, by chance, I wrote numpy.array(weights[0]) and this came back with shape (500, 100).

Can someone explain why?
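The setup above can be sketched without Keras. For a Dense layer, get_weights() returns a list [kernel, bias]; below, NumPy arrays stand in for the real Keras weights, assuming a hidden layer of 100 units on 500 inputs (the sizes from the question):

```python
import numpy as np

# Stand-in for model.layers[0].get_weights() from a Dense layer with
# 500 inputs and 100 hidden units: a two-element list [kernel, bias].
weights = [np.random.rand(500, 100), np.random.rand(100)]

print(len(weights))     # 2   -> the list holds [kernel, bias]
print(len(weights[0]))  # 500 -> kernel rows, one per input
print(len(weights[1]))  # 100 -> one bias per hidden neuron

# The kernel is already the (500, 100) array being sought:
m = np.array(weights[0])
print(m.shape)          # (500, 100)
```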

The weights Keras hands back here are not a single tensor; get_weights() returns a nested list of arrays. To illustrate the concept, consider the list:

>>> list = [[[1, 2, 3], [1, 2, 3], [1, 2, 3], [1, 2, 3], [1, 2, 3]], [1, 2, 3]]

Here, the first element of list is itself a list of five 3-element lists, while the second element is a flat 3-element list. When you do:

>>> len(list)

the output is:

2 (which is 2 in your case)

Also,

>>> len(list[0])

5 (which is 500 in your case)

>>> len(list[1])

3 (which is 100 in your case)

But when you try to convert it to an array:

>>> np.array(list[0]).shape

the answer is:

(5, 3) (which is (500, 100) in your case)
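The same toy list also shows why reshaping the whole weights list fails: the two sublists have different lengths, so NumPy cannot build one rectangular array from them. A short check (behavior on the ragged case varies by NumPy version, so the error path is hedged with try/except):

```python
import numpy as np

lst = [[[1, 2, 3], [1, 2, 3], [1, 2, 3], [1, 2, 3], [1, 2, 3]], [1, 2, 3]]

# Converting a sub-list works, because its rows all have the same length:
print(np.array(lst[0]).shape)  # (5, 3)

# Converting the whole ragged list cannot give a numeric 2-D array;
# recent NumPy versions raise an error for such inhomogeneous input:
try:
    print(np.array(lst).shape)
except ValueError as e:
    print("ragged input:", e)
```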

This is because each element inside list[0] (which is weights[0] in your case) is itself an n-length list. So when I asked you to return:

len(weights[0][0]) 

it returned:

100

because weights[0] contains 500 such elements, each of length 100. Now, if you are wondering what each of those 100 values means: they are the weights of the corresponding connections, i.e.

weights[0][0] = weights from the first input to all 100 hidden neurons
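For the visualization goal from the question, the useful slices of the kernel are its rows and columns. A minimal sketch, again with a random NumPy array standing in for the real (500, 100) kernel:

```python
import numpy as np

kernel = np.random.rand(500, 100)  # stand-in for np.array(weights[0])

# Row i: weights from input i to all 100 hidden neurons
# (this is what weights[0][0] holds for i = 0).
from_first_input = kernel[0]       # shape (100,)

# Column j: incoming weights of hidden neuron j, one per input.
# This is the vector to reshape/plot to see what neuron j has learned.
neuron_j = kernel[:, 0]            # shape (500,)

print(from_first_input.shape)  # (100,)
print(neuron_j.shape)          # (500,)
```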
