
What's the sequence length of a Keras Bidirectional layer?

If I have:

        self.model.add(LSTM(lstm1_size, input_shape=(seq_length, feature_dim), return_sequences=True))
        self.model.add(BatchNormalization())
        self.model.add(Dropout(0.2))

then my seq_length specifies how many slices of data I want to process at a time. In case it matters, my model is a sequence-to-sequence one (same size).

But if I have:

        self.model.add(Bidirectional(LSTM(lstm1_size, input_shape=(seq_length, feature_dim), return_sequences=True)))
        self.model.add(BatchNormalization())
        self.model.add(Dropout(0.2))

does this double the sequence size? Or does each time step get seq_length / 2 before and after it?

Using a Bidirectional LSTM layer has no effect on the sequence length. I tested this with the following code:

from keras.models import Sequential
from keras.layers import Bidirectional, LSTM, BatchNormalization, Dropout

model = Sequential()
lstm1_size = 50
seq_length = 128
feature_dim = 20
model.add(Bidirectional(LSTM(lstm1_size, input_shape=(seq_length, feature_dim), return_sequences=True)))
model.add(BatchNormalization())
model.add(Dropout(0.2))

batch_size = 32

# Build with an explicit batch dimension so model.summary() shows concrete shapes
model.build(input_shape=(batch_size, seq_length, feature_dim))

model.summary()

This results in the following output with the Bidirectional layer:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
bidirectional_1 (Bidirection (32, 128, 100)            28400     
_________________________________________________________________
batch_normalization_1 (Batch (32, 128, 100)            400       
_________________________________________________________________
dropout_1 (Dropout)          (32, 128, 100)            0         
=================================================================
Total params: 28,800
Trainable params: 28,600
Non-trainable params: 200

Without the Bidirectional layer:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_1 (LSTM)                (None, 128, 50)           14200     
_________________________________________________________________
batch_normalization_1 (Batch (None, 128, 50)           200       
_________________________________________________________________
dropout_1 (Dropout)          (None, 128, 50)           0         
=================================================================
Total params: 14,400
Trainable params: 14,300
Non-trainable params: 100
_________________________________________________________________
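To make clear where the doubling actually happens, here is a minimal sketch (reusing the same lstm1_size, seq_length and feature_dim values as above): the Bidirectional wrapper's merge_mode argument controls how the forward and backward outputs are combined. With the default merge_mode='concat' the feature dimension doubles from 50 to 100, while merge_mode='sum' keeps it at 50; the time axis stays at seq_length either way.

from keras.models import Sequential
from keras.layers import Bidirectional, LSTM

lstm1_size = 50
seq_length = 128
feature_dim = 20

# Default merge_mode='concat' joins the forward and backward outputs along
# the feature axis, so the last dimension doubles (50 -> 100) while the
# time axis stays at seq_length.
concat_model = Sequential()
concat_model.add(Bidirectional(
    LSTM(lstm1_size, return_sequences=True),
    input_shape=(seq_length, feature_dim)))
print(concat_model.output_shape)  # (None, 128, 100)

# merge_mode='sum' adds the two directions element-wise instead,
# keeping the feature dimension at lstm1_size.
sum_model = Sequential()
sum_model.add(Bidirectional(
    LSTM(lstm1_size, return_sequences=True),
    merge_mode='sum',
    input_shape=(seq_length, feature_dim)))
print(sum_model.output_shape)  # (None, 128, 50)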
