
Understanding Keras LSTM Tensorboard Graph

I am confused by the graph of my Keras LSTM network that I get in TensorBoard. I have defined the Keras LSTM network like this:

# neurons, look_back, weight_decay, dropout and outputs are hyperparameters defined earlier in the script
from keras.models import Sequential
from keras.layers import LSTM, Dense, Bidirectional
from keras.regularizers import l2

model = Sequential()
model.add(LSTM(neurons, return_sequences=True, input_shape=(look_back, 2)))
#model.add(Bidirectional(LSTM(neurons, return_sequences=True), input_shape=(look_back, 2)))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(neurons, return_sequences=True, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(LSTM(20, return_sequences=False, recurrent_regularizer=l2(weight_decay),
          kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), dropout=dropout, recurrent_dropout=dropout))
model.add(Dense(outputs, kernel_regularizer=l2(weight_decay), bias_regularizer=l2(weight_decay), activation='linear'))
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])
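
For context, the graph in question would have been written out with Keras's TensorBoard callback; the following is only a minimal sketch of how that is typically done (the log_dir path and the X_train/y_train names are assumptions, not from the original post):

from keras.callbacks import TensorBoard

# write_graph=True dumps the model graph that TensorBoard renders under the Graphs tab
tb = TensorBoard(log_dir='./logs', write_graph=True)
model.fit(X_train, y_train, epochs=10, batch_size=32, callbacks=[tb])
# then inspect with: tensorboard --logdir ./logs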

I thought this would give me a sequential model in which each LSTM takes the output of the previous LSTM, and it mostly looks that way. But I also see one of the LSTM layers acting as an input to every subsequent layer:

[Screenshot: TensorBoard graph of the full model]

In that graph, it looks like lstm_2 feeds into every layer, which I would not expect. So my question is: is this expected, and if so, why?

Thanks.

I figured out why it is displayed this way. It turns out that Keras creates a learning_phase placeholder and places it with the second hidden layer. The learning_phase object branches out to every single layer, but the LSTM itself does not. I will refer to this answer for more details.
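
To make this concrete, here is a small illustration (my own sketch, not from the answer referenced above) of the shared learning_phase flag in TF1-era Keras; every layer that uses dropout reads this one backend-level placeholder, which is why it appears as an edge into so many nodes of the graph:

from keras import backend as K

# A single symbolic flag shared by the whole graph: 0 = inference, 1 = training.
# Layers with dropout/recurrent_dropout branch off it; the plain LSTM ops do not.
phase = K.learning_phase()
print(phase)

# The flag can be fed explicitly when building backend functions, e.g.:
# f = K.function([model.input, K.learning_phase()], [model.output])
# f([x_batch, 1])   # training-mode forward pass (dropout active)
# f([x_batch, 0])   # inference-mode forward pass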

This is what the inside of the lstm_1 layer looks like in my TensorBoard graph:

[Screenshot: expanded lstm_1 node in the TensorBoard graph]
