
Keras VGG model for MNIST: Disparity between training and validation accuracy

I have created the following model with Keras. The dataset is MNIST.

'''
    conv - relu - conv- relu - pool -
    conv - relu - conv- relu - pool -
    conv - relu - conv- relu - pool -
    affine - relu - dropout - affine - dropout - softmax
'''

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout, Activation

# input_shape and num_classes are assumed to be defined earlier (for MNIST: (28, 28, 1) and 10)
model = Sequential()
model.add(Conv2D(16, kernel_size=(3, 3),
                 padding='same',
                 input_shape=input_shape)) 
model.add(Activation('relu'))
model.add(Conv2D(16, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2))) 
model.add(Conv2D(32, (3, 3), padding='same', activation='relu'))
model.add(Conv2D(32, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Conv2D(64, (3, 3), padding='same', activation='relu'))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(50, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(num_classes))
model.add(Dropout(0.5))
model.add(Activation('softmax'))
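
The compile/fit code is not shown in the question, so here is a minimal sketch of how such a model is typically trained on MNIST. The optimizer, batch size, and preprocessing are assumptions (the data preparation would precede the model definition above); the final val_loss matching the reported test loss suggests the test set was passed as validation_data.

import keras
from keras.datasets import mnist

num_classes = 10
input_shape = (28, 28, 1)  # 28x28 grayscale digits, channels-last

# Load MNIST, add a channel axis, and scale pixels to [0, 1]
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1).astype('float32') / 255.0
x_test = x_test.reshape(-1, 28, 28, 1).astype('float32') / 255.0

# One-hot encode the labels for categorical_crossentropy
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)

model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])

model.fit(x_train, y_train,
          batch_size=128,
          epochs=20,
          validation_data=(x_test, y_test))

score = model.evaluate(x_test, y_test, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])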

The following is the result:

Epoch 1/20
60000/60000 [==============================] - 10s - loss: 1.2707 - acc: 0.5059 - val_loss: 0.0881 - val_acc: 0.9785
Epoch 2/20                                                                                                                              
60000/60000 [==============================] - 9s - loss: 0.9694 - acc: 0.5787 - val_loss: 0.0449 - val_acc: 0.9873                                    

...        

Epoch 19/20                                                         
60000/60000 [==============================] - 9s - loss: 0.8530 - acc: 0.6004 - val_loss: 0.0282 - val_acc: 0.9937                     
Epoch 20/20                       
60000/60000 [==============================] - 9s - loss: 0.8564 - acc: 0.5982 - val_loss: 0.0383 - val_acc: 0.9910                     
Test loss: 0.0382921607383        
Test accuracy: 0.991                    

Why is the training accuracy so low, while the validation accuracy is so high?

The dropout on your last Dense layer randomly drops half of the 10 class neurons. Your last layer can only be accurate about half of the time, because on average half of its outputs are missing. (Dropout is only active during training, which is why the validation accuracy is not affected.)

Try removing it, and I assume the training and validation accuracy will even out.
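
In code, the suggested fix is to drop the second Dropout so that the tail of the model above reads (a sketch; only the final layers are shown):

model.add(Flatten())
model.add(Dense(50, activation='relu'))
model.add(Dropout(0.5))                 # dropout on the hidden layer is fine
model.add(Dense(num_classes))
model.add(Activation('softmax'))        # no Dropout between the class logits and the softmax

With dropout removed from the output layer, the training accuracy reported during fitting should climb toward the validation accuracy.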
