How to create an autoencoder using dropout in Dense layers using Keras
I am trying to reconstruct the output for a digits dataset, and to do that I am experimenting with different autoencoder approaches. One of them uses dropout in the Dense layers.
As shown in the figure, the encoder and decoder halves reduce the dimensionality toward the center. This is where the problem starts, because the Dense layers with Dropout do not work as expected.
Note that this is my second attempt at an autoencoder; the first, shown here, I have already completed.
Here is what I (naively) wrote:
from keras import models
from keras.layers import Dense, Dropout  # Dense and Dropout are used unqualified below
from keras import backend as K
network = models.Sequential()
input_shape = x_train_clean.shape[1] # input_shape = 3714
outer_layer = int(input_shape / 7)
inner_layer = int(input_shape / 14)
network.add(Dropout(0.2, input_shape=(input_shape,)))
network.add(Dense(units=outer_layer, activation='relu'))
network.add(Dropout(0.2))
network.add(Dense(units=inner_layer, activation='relu'))
network.add(Dropout(0.2))
network.add(Dense(units=10, activation='linear'))
network.add(Dropout(0.2))
network.add(Dense(units=inner_layer, activation='relu'))
network.add(Dropout(0.2))
network.add(Dense(units=outer_layer, activation='relu'))
network.add(Dropout(0.2))
network.compile(loss=lambda true, pred: K.sqrt(K.mean(K.square(pred-true))), # RMSE
optimizer='rmsprop', # Root Mean Square Propagation
metrics=['accuracy']) # Accuracy performance metric
history = network.fit(x_train_noisy, # Features
x_train_clean, # Target vector
epochs=3, # Number of epochs
verbose=0, # No output
batch_size=100, # Number of observations per batch
shuffle=True,) # training data will be randomly shuffled at each epoch
The output error states it quite clearly:
tensorflow.python.framework.errors_impl.InvalidArgumentError: Incompatible shapes: [100,530] vs. [100,3714] [[{{node loss_1/dropout_9_loss/sub}}]]
So it cannot map from the lower dimensionality back up to the higher one.
The error you are seeing has nothing to do with the network's ability to learn. network.summary() reveals that the model's output shape is (None, 530) while the input shape is (None, 3714), which causes the error during training.
Input that causes the error during training:
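The 530 in that summary is simply the width of the last Dense layer (the trailing Dropout does not change the shape). A quick check of the integer divisions from the question, assuming input_shape = 3714, shows where it comes from:

```python
input_shape = 3714

# These mirror the int(input_shape / 7) and int(input_shape / 14) in the question.
outer_layer = input_shape // 7   # width of the first and last Dense layers
inner_layer = input_shape // 14  # width of the inner Dense layers

print(outer_layer)  # 530  -> the model's final output width
print(inner_layer)  # 265
```

Since the last Dense layer has 530 units, the model outputs (None, 530), which cannot be compared against a (None, 3714) target.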
x_train_noisy = np.zeros([100, 3714]) #just to test
x_train_clean = np.ones([100, 3714])
tensorflow.python.framework.errors_impl.InvalidArgumentError: Incompatible shapes: [100,530] vs. [100,3714]
Input that trains without error:
x_train_noisy = np.zeros([100, 3714]) #just to test
x_train_clean = np.ones([100, 530])
100/100 [==============================] - 1s 11ms/step - loss: 1.0000 - acc: 1.0000
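If the goal really is to reconstruct the full 3714-dimensional input (rather than reshaping the target to 530), one way to resolve the mismatch, sketched here rather than taken from the answer above, is to append a final Dense layer that maps back to the input dimensionality:

```python
from keras import models
from keras.layers import Dense, Dropout

input_shape = 3714
outer_layer = input_shape // 7   # 530
inner_layer = input_shape // 14  # 265

network = models.Sequential()
network.add(Dropout(0.2, input_shape=(input_shape,)))
network.add(Dense(units=outer_layer, activation='relu'))
network.add(Dropout(0.2))
network.add(Dense(units=inner_layer, activation='relu'))
network.add(Dropout(0.2))
network.add(Dense(units=10, activation='linear'))  # bottleneck
network.add(Dropout(0.2))
network.add(Dense(units=inner_layer, activation='relu'))
network.add(Dropout(0.2))
network.add(Dense(units=outer_layer, activation='relu'))
network.add(Dropout(0.2))
# Final layer restores the input dimensionality so the output
# matches the (None, 3714) clean target.
network.add(Dense(units=input_shape, activation='linear'))
```

With this layer in place, network.summary() ends at (None, 3714) and fitting against x_train_clean of shape [100, 3714] no longer raises the shape error.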