How to create autoencoder using dropout in Dense layers using Keras
I am trying to reconstruct the output of a digit dataset, and to do so I am experimenting with different autoencoder approaches. One approach is to use dropout in the Dense layers.
As shown in the figure, the encoder and decoder halves reduce the dimensionality toward the center. This is where the problem starts, because the Dense layers with Dropout never scale back up to the original size.
Note that this is my second attempt at an autoencoder; the first one, shown here, I have already completed.
Here is what I (naively) wrote:
from keras import models
from keras.layers import Dense, Dropout
from keras import backend as K
network = models.Sequential()
input_shape = x_train_clean.shape[1] # input_shape = 3714
outer_layer = int(input_shape / 7)
inner_layer = int(input_shape / 14)
network.add(Dropout(0.2, input_shape=(input_shape,)))
network.add(Dense(units=outer_layer, activation='relu'))
network.add(Dropout(0.2))
network.add(Dense(units=inner_layer, activation='relu'))
network.add(Dropout(0.2))
network.add(Dense(units=10, activation='linear'))
network.add(Dropout(0.2))
network.add(Dense(units=inner_layer, activation='relu'))
network.add(Dropout(0.2))
network.add(Dense(units=outer_layer, activation='relu'))
network.add(Dropout(0.2))
network.compile(loss=lambda true, pred: K.sqrt(K.mean(K.square(pred-true))), # RMSE
optimizer='rmsprop', # Root Mean Square Propagation
metrics=['accuracy']) # Accuracy performance metric
history = network.fit(x_train_noisy, # Features
x_train_clean, # Target vector
epochs=3, # Number of epochs
verbose=0, # No output
batch_size=100, # Number of observations per batch
shuffle=True,) # training data will be randomly shuffled at each epoch
The output error states the problem quite clearly:
tensorflow.python.framework.errors_impl.InvalidArgumentError: Incompatible shapes: [100,530] vs. [100,3714] [[{{node loss_1/dropout_9_loss/sub}}]]
The network cannot get from the lower dimensions back up to the higher ones.
The error you are seeing has nothing to do with the network's ability to learn. network.summary() reveals that the output shape is (None, 530) while the input shape is (None, 3714), which causes the error during training.
Input that causes the error during training:
x_train_noisy = np.zeros([100, 3714]) #just to test
x_train_clean = np.ones([100, 3714])
tensorflow.python.framework.errors_impl.InvalidArgumentError: Incompatible shapes: [100,530] vs. [100,3714]
Input that trains without error:
x_train_noisy = np.zeros([100, 3714]) #just to test
x_train_clean = np.ones([100, 530])
100/100 [==============================] - 1s 11ms/step - loss: 1.0000 - acc: 1.0000
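In other words, the model ends at the 530-unit layer, so its output can never match a 3714-dimensional target. A minimal sketch of a fix (written against tensorflow.keras rather than standalone keras, and keeping the question's layer sizes) is to finish the decoder with one more Dense layer whose width equals the input dimension, so the reconstruction has the same shape as the clean target:

```python
# Sketch of the corrected autoencoder: the decoder is completed with a final
# Dense layer of width input_shape, so output shape matches the target shape.
from tensorflow.keras import models, layers

input_shape = 3714                    # feature dimension from the question
outer_layer = input_shape // 7        # 530
inner_layer = input_shape // 14       # 265

network = models.Sequential()
network.add(layers.Dropout(0.2, input_shape=(input_shape,)))
network.add(layers.Dense(outer_layer, activation='relu'))
network.add(layers.Dropout(0.2))
network.add(layers.Dense(inner_layer, activation='relu'))
network.add(layers.Dropout(0.2))
network.add(layers.Dense(10, activation='linear'))   # bottleneck
network.add(layers.Dropout(0.2))
network.add(layers.Dense(inner_layer, activation='relu'))
network.add(layers.Dropout(0.2))
network.add(layers.Dense(outer_layer, activation='relu'))
network.add(layers.Dropout(0.2))
# The missing piece: project back up to the original dimensionality.
network.add(layers.Dense(input_shape, activation='linear'))

print(network.output_shape)  # (None, 3714), matching x_train_clean
```

With this final layer in place, fitting on x_train_noisy against a (100, 3714) x_train_clean target no longer raises the incompatible-shapes error.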