Multi-layered bidirectional_dynamic_rnn: incompatible with MultiRNNCell?
I want to create a multi-layered bidirectional LSTM in Tensorflow. Currently my single-layered model looks like:
cell_fw = tf.contrib.rnn.LSTMCell(hidden_size)
cell_bw = tf.contrib.rnn.LSTMCell(hidden_size)
(self.out_fw, self.out_bw), _ = tf.nn.bidirectional_dynamic_rnn(cell_fw, cell_bw, input, ...)
To turn this into a multi-layered model, I suspect I cannot simply wrap a few LSTMCells with MultiRNNCell like so:
multi_cell_fw = tf.contrib.rnn.MultiRNNCell([cell_fw] * num_layers, ...)
and feed them into bidirectional_dynamic_rnn, since the forward and backward LSTMs in each layer both need the output of both directions of the preceding layer. Currently my solution is to create my bidirectional_dynamic_rnns in a loop, feeding in the concatenated output of the LSTMs of the preceding layer.
However, it's not very clean and, frankly, I'm not sure it's correct, though it does work on a toy dataset. Is there a better way that's comparably elegant to using something like MultiRNNCell?
I'm using Tensorflow API r1.0.
Just do:
multi_cell_fw = tf.contrib.rnn.MultiRNNCell([tf.contrib.rnn.LSTMCell(hidden_size) for _ in range(num_layers)], ...)
That should work. Note that each layer needs its own LSTMCell instance: [cell_fw] * num_layers (or a comprehension over the same cell_fw object) would reuse one cell, and hence one set of weights, in every layer.