
How to wrap a tensorflow RNNCell in keras?

I would like to implement a custom LSTM cell as a keras layer. This implementation already exists in tensorflow, so I was wondering if it is possible to just wrap it as a keras layer and call it in the model.

I found the official documentation too simplified to see how to build a custom RNN layer. There are similar questions here and here, but they seem unresolved.

Thanks in advance for your help!

From my understanding, you should just be able to initialize the cell in the __init__() of the layer class, and then call it with your input inside the call() method.

Ex:

import tensorflow as tf
from tensorflow.keras.layers import Layer

class MySimpleLayer(Layer):
  def __init__(self, lstm_size):
    super(MySimpleLayer, self).__init__()
    # Keep the TensorFlow cell as an attribute of the keras layer.
    self.lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)

  def call(self, batch, state):
    # Delegate the forward pass to the wrapped cell.
    return self.lstm(batch, state)

# lstm_size, batch and state are placeholders for your own values.
layer = MySimpleLayer(lstm_size)
logits, state = layer(batch, state)  # the cell returns (output, new_state)

This implementation is as basic as it gets, though, so you might need to look into the build() and compute_output_shape() methods for more complex use cases, as sketched below.
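For instance, a fuller layer could create its own weights in build() once the input shape is known, and report its output shape via compute_output_shape(). The sketch below is only illustrative (the projection weight, its shape, and the name MyLSTMLayer are my own additions, not part of the original answer) and assumes a TF 1.x environment where tf.contrib is still available:

import tensorflow as tf
from tensorflow.keras.layers import Layer

class MyLSTMLayer(Layer):
  def __init__(self, lstm_size, **kwargs):
    super(MyLSTMLayer, self).__init__(**kwargs)
    self.lstm_size = lstm_size
    self.lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)

  def build(self, input_shape):
    # Create extra trainable weights once the input shape is known,
    # here a purely illustrative projection applied to the cell output.
    self.projection = self.add_weight(
        name='projection',
        shape=(self.lstm_size, self.lstm_size),
        initializer='glorot_uniform',
        trainable=True)
    super(MyLSTMLayer, self).build(input_shape)

  def call(self, batch, state):
    output, new_state = self.lstm(batch, state)
    # Apply the extra projection to the cell output.
    return tf.matmul(output, self.projection), new_state

  def compute_output_shape(self, input_shape):
    # One vector of size lstm_size per sample; state shapes omitted for brevity.
    return (input_shape[0], self.lstm_size)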

The documentation of tensorflow may have improved since this question was posted.

You may want to check this guide or this SO answer for reference.
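In newer TensorFlow versions, the pattern shown in the Keras RNN guide is to write a cell class and let tf.keras.layers.RNN unroll it over the time dimension. Below is a minimal sketch along those lines, assuming TF 2.x; the MinimalRNNCell class and the sizes used are illustrative:

import tensorflow as tf

class MinimalRNNCell(tf.keras.layers.Layer):
  def __init__(self, units, **kwargs):
    super(MinimalRNNCell, self).__init__(**kwargs)
    self.units = units
    self.state_size = units

  def build(self, input_shape):
    # Input-to-hidden and hidden-to-hidden weights.
    self.kernel = self.add_weight(
        shape=(input_shape[-1], self.units),
        initializer='glorot_uniform', name='kernel')
    self.recurrent_kernel = self.add_weight(
        shape=(self.units, self.units),
        initializer='glorot_uniform', name='recurrent_kernel')
    self.built = True

  def call(self, inputs, states):
    # One step of the recurrence: combine current input with previous output.
    prev_output = states[0]
    output = tf.matmul(inputs, self.kernel) + tf.matmul(prev_output, self.recurrent_kernel)
    return output, [output]

# Wrap the cell so it runs over whole sequences of shape (batch, timesteps, features).
layer = tf.keras.layers.RNN(MinimalRNNCell(32))
x = tf.keras.Input(shape=(None, 8))
y = layer(x)  # shape (batch, 32)

Any object exposing call(inputs, states) together with a state_size attribute can be wrapped this way, which also covers built-in cells such as tf.keras.layers.LSTMCell.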

