How to implement a bidirectional wrapper in functional API?

Does the bidirectional layer connect encoder to decoder, or decoder to decoder? These are the three parts of the encoder, which feed the decoders below:

from tensorflow.keras.layers import Input, LSTM, Dense, TimeDistributed
from tensorflow.keras.models import Model

# embed_layer, maxLen and vocab_size are defined elsewhere in the asker's code
# encoding layers: three stacked LSTMs, each returning its final hidden and cell states
input_context = Input(shape=(maxLen,), dtype='int32', name='input_context')
input_ctx_embed = embed_layer(input_context)
encoder_lstm, h1, c1 = LSTM(256, return_state=True, return_sequences=True)(input_ctx_embed)
encoder_lstm2, h2, c2 = LSTM(256, return_state=True, return_sequences=True)(encoder_lstm)
_, h3, c3 = LSTM(256, return_state=True)(encoder_lstm2)
encoder_states = [h1, c1, h2, c2, h3, c3]

# layers for the decoder
input_target = Input(shape=(maxLen,), dtype='int32', name='input_target')
input_tar_embed = embed_layer(input_target)
# each decoder LSTM uses the final states of the matching encoder LSTM as its initial state
decoder_lstm, context_h, context_c = LSTM(256, return_state=True, return_sequences=True)(
    input_tar_embed, initial_state=[h1, c1])
decoder_lstm2, context_h2, context_c2 = LSTM(256, return_state=True, return_sequences=True)(
    decoder_lstm, initial_state=[h2, c2])
final, context_h3, context_c3 = LSTM(256, return_state=True, return_sequences=True)(
    decoder_lstm2, initial_state=[h3, c3])
dense_layer = Dense(vocab_size, activation='softmax')
output = TimeDistributed(dense_layer)(final)
# output = Dropout(0.3)(output)
model = Model([input_context, input_target], output)

Not sure where the bidirectional layer is, since in my opinion, if you would like to use keras.layers.LSTM() to build a bidirectional RNN structure without using keras.layers.Bidirectional(), then there's one setting in keras.layers.LSTM() called go_backwards, whose default is False; setting it to True makes the LSTM run backwards. And if you are just asking where to put a bidirectional LSTM in an encoder-decoder structure, then my answer is "you can put it wherever you want, if that makes your model better."
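
For illustration, here is a minimal sketch of the Bidirectional() route in the functional API. This is not the asker's code; it assumes the same embed_layer, maxLen, input_tar_embed and 256-unit sizes as the snippet above:

from tensorflow.keras.layers import Input, LSTM, Bidirectional, Concatenate

input_context = Input(shape=(maxLen,), dtype='int32', name='input_context')
input_ctx_embed = embed_layer(input_context)

# with return_state=True, Bidirectional(LSTM) returns five tensors:
# merged outputs, forward h, forward c, backward h, backward c
encoder_out, fh, fc, bh, bc = Bidirectional(
    LSTM(256, return_state=True, return_sequences=True))(input_ctx_embed)

# concatenate the two directions into a single state pair; since merge_mode
# defaults to 'concat', the states are 512-wide, so the decoder needs 512 units
state_h = Concatenate()([fh, bh])
state_c = Concatenate()([fc, bc])
decoder_lstm, dh, dc = LSTM(512, return_state=True, return_sequences=True)(
    input_tar_embed, initial_state=[state_h, state_c])

# the go_backwards route mentioned above, by contrast, only reverses a single
# layer's reading direction (its output sequence comes back in reversed time order)
backward_seq, hb, cb = LSTM(256, return_state=True, return_sequences=True,
                            go_backwards=True)(input_ctx_embed)

Note that Bidirectional() runs the wrapped layer in both directions and re-reverses the backward outputs before merging, whereas a lone LSTM(go_backwards=True) leaves its output in reversed time order, so you would have to flip it yourself before combining it with a forward pass.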

If I mixed up anything, let me know.
