
How to expand neural network layer?

I am trying to implement the expandable neural network described in this paper: Lifelong Learning with Dynamically Expandable Networks (Jeongtae Lee, Jaehong Yoon, Eunho Yang, Sung Ju Hwang, Aug 2017).

Now suppose I have an LSTM layer like this:

tf.contrib.rnn.LSTMCell(128, state_is_tuple=True)

Say I want to leave the gates aside for now and just expand this LSTM layer from 128 to 256 units, retaining the previous weights and setting the new weights to 0.000001 so they are insignificant to the following layers. How can I do that?
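One way is to do the surgery on the NumPy side and then assign the result back into the larger cell's variables. The sketch below assumes the TF1-style fused kernel layout used by `tf.contrib.rnn.LSTMCell` — one matrix of shape `[input_size + num_units, 4 * num_units]` with the four gate blocks concatenated along the columns — and `expand_lstm_kernel` is a hypothetical helper name, not part of TensorFlow:

```python
import numpy as np

def expand_lstm_kernel(old_kernel, old_bias, input_size,
                       old_units, new_units, eps=1e-6):
    """Expand a fused LSTM kernel of shape
    [input_size + old_units, 4 * old_units] to
    [input_size + new_units, 4 * new_units].

    Old weights are copied block-wise (one block per gate) and every
    new entry is filled with eps, so the added units barely influence
    the layers that follow."""
    new_kernel = np.full((input_size + new_units, 4 * new_units),
                         eps, dtype=old_kernel.dtype)
    new_bias = np.full(4 * new_units, eps, dtype=old_bias.dtype)
    for g in range(4):  # one column block per gate
        old_cols = slice(g * old_units, (g + 1) * old_units)
        new_cols = slice(g * new_units, g * new_units + old_units)
        # rows for the input features keep their positions
        new_kernel[:input_size, new_cols] = old_kernel[:input_size, old_cols]
        # rows for the old recurrent units come right after the input rows
        new_kernel[input_size:input_size + old_units, new_cols] = \
            old_kernel[input_size:, old_cols]
        new_bias[new_cols] = old_bias[old_cols]
    return new_kernel, new_bias
```

After building the new 256-unit cell in a fresh graph, the expanded arrays could be pushed into its variables with something like `sess.run(kernel_var.assign(new_kernel))`. The per-gate loop is the important part: because the gates are concatenated column-wise, a plain `np.pad` of the whole matrix would misalign the gate blocks.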

Also, is it possible to change the input size and retain the weights by assigning zero weights to the new features at the beginning? For example, if I have input like this:

inputs = tf.placeholder(tf.float32, [None, 30, 2])

and I want to change it to:

inputs = tf.placeholder(tf.float32, [None, 30, 5])

Then what should I do to my layers so that they load the previously trained weights and pad zeros to fit the new input shape?
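For the input-size change, only the first `input_size` rows of the same fused kernel depend on the features, so it is enough to zero-pad that row block. A sketch under the same layout assumption (`expand_input_size` is again a hypothetical name):

```python
import numpy as np

def expand_input_size(old_kernel, old_input_size, new_input_size):
    """Grow the input-feature rows of a fused LSTM kernel from
    [old_input_size + units, 4 * units] to
    [new_input_size + units, 4 * units].

    Rows for the new features are zero, so they contribute nothing
    until training updates them; all trained weights are kept."""
    units = old_kernel.shape[0] - old_input_size
    new_kernel = np.zeros((new_input_size + units, old_kernel.shape[1]),
                          dtype=old_kernel.dtype)
    new_kernel[:old_input_size] = old_kernel[:old_input_size]  # old feature rows
    new_kernel[new_input_size:] = old_kernel[old_input_size:]  # recurrent rows
    return new_kernel
```

No per-gate loop is needed here because the gate blocks live along the columns, which are untouched; only the input rows shift.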

An answer for any other type of layer would be very much appreciated as well. Literally, any help will be appreciated.

  1. The new weights should be initialized to random values.
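The reason this matters: if every new weight shares one constant (0.000001 or zero), the new units receive identical gradients and remain redundant copies of each other. Tiny random values break that symmetry while still keeping the new units near-inactive at first (hypothetical helper):

```python
import numpy as np

def init_new_block(rows, cols, scale=1e-3, seed=0):
    """Small random values for newly added weights.

    A shared constant would give every new unit identical gradients,
    so they would stay copies of each other; small random noise
    breaks the symmetry while keeping the units near-inactive."""
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, scale, size=(rows, cols))
```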
