Why is my model's performance improving so slowly?

I have this CNN model with a 3-block VGG-style architecture:

import tensorflow as tf
from tensorflow.keras import datasets, layers, models
import matplotlib.pyplot as plt
from keras.preprocessing import image
from keras.preprocessing.image import ImageDataGenerator
from keras.regularizers import L2, L1, L1L2
from keras.optimizers import SGD, Adam, Adagrad, RMSprop
from keras.models import load_model, Model
from IPython.display import FileLink  # used below to expose the saved model file in a notebook
import numpy as np
import keras as k

# Load the CIFAR-10 data, already split into train/test
(train_images, train_labels), (test_images, test_labels) = datasets.cifar10.load_data()

# Normalize pixel values to [0, 1]
train_images = train_images / 255.0
test_images = test_images / 255.0

# Convert labels to one-hot encoding
num_classes = 10
train_labels = k.utils.to_categorical(train_labels, num_classes)
test_labels = k.utils.to_categorical(test_labels, num_classes)

# Data Augmentation
datagen = ImageDataGenerator(
    width_shift_range=0.1,
    height_shift_range=0.1,
    horizontal_flip=True,
)

datagen.fit(train_images)

# Hyperparameters and candidate optimizers
reg = None          # no kernel regularization
num_filters = 32
ac = 'relu'
adm = Adam(lr=0.001, decay=0, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
sgd = SGD(lr=0.01, momentum=0.9)
rms = RMSprop(lr=0.0001, decay=1e-6)
agr = Adagrad(learning_rate=0.0001, initial_accumulator_value=0.1, epsilon=1e-08)
opt = adm           # Adam is the optimizer actually passed to compile() below
drop_dense = 0.5    # dropout rate for the dense block
drop_conv = 0.2     # base dropout rate for the conv blocks

model = models.Sequential()

# VGG block 1
model.add(layers.Conv2D(num_filters, (3, 3), activation=ac, kernel_regularizer=reg, input_shape=(32, 32, 3),padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.Conv2D(num_filters, (3, 3), activation=ac,kernel_regularizer=reg,padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.MaxPooling2D(pool_size=(2, 2)))   
model.add(layers.Dropout(drop_conv))

# VGG block 2
model.add(layers.Conv2D(2*num_filters, (3, 3), activation=ac,kernel_regularizer=reg,padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.Conv2D(2*num_filters, (3, 3), activation=ac,kernel_regularizer=reg,padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.MaxPooling2D(pool_size=(2, 2)))   
model.add(layers.Dropout(2 * drop_conv))

# VGG block 3
model.add(layers.Conv2D(4*num_filters, (3, 3), activation=ac,kernel_regularizer=reg,padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.Conv2D(4*num_filters, (3, 3), activation=ac,kernel_regularizer=reg,padding='same'))
model.add(layers.BatchNormalization())
model.add(layers.MaxPooling2D(pool_size=(2, 2)))
model.add(layers.Dropout(3 * drop_conv))

# Classifier head
model.add(layers.Flatten())
model.add(layers.Dense(512, activation=ac,kernel_regularizer=reg))
model.add(layers.BatchNormalization())
model.add(layers.Dropout(drop_dense))
model.add(layers.Dense(num_classes, activation='softmax'))


model.compile(loss='categorical_crossentropy', metrics=['accuracy'],optimizer=adm)

model.summary()


# Note: fit_generator is deprecated in recent TF/Keras releases; model.fit accepts generators directly
history = model.fit_generator(datagen.flow(train_images, train_labels, batch_size=256),
                              steps_per_epoch=len(train_images) // 256, epochs=200,
                              validation_data=(test_images, test_labels))

loss, accuracy = model.evaluate(test_images, test_labels)
print("Accuracy is : ", accuracy * 100)
print("Loss is : ", loss)


N = 200
plt.style.use("ggplot")
plt.figure()
plt.plot(np.arange(0, N), history.history["loss"], label="train_loss")
plt.plot(np.arange(0, N), history.history["val_loss"], label="val_loss")
plt.plot(np.arange(0, N), history.history["accuracy"], label="train_acc")
plt.plot(np.arange(0, N), history.history["val_accuracy"], label="val_acc")
plt.title("Training Loss and Accuracy")
plt.xlabel("Epochs")
plt.ylabel("Loss/Accuracy")
plt.legend(loc="upper left")
plt.show()

model.save("model_test_9.h5")  # serialize weights to HDF5
FileLink(r'model_test_9.h5')   # display a download link in a Jupyter notebook

# Run: Adam optimizer, dropout, data augmentation

Output:

Epoch 40/200
195/195 [==============================] - 21s 107ms/step - loss: 0.4334 - accuracy: 0.8507 - val_loss: 0.5041 - val_accuracy: 0.8357
Epoch 41/200
195/195 [==============================] - 21s 107ms/step - loss: 0.4289 - accuracy: 0.8522 - val_loss: 0.5354 - val_accuracy: 0.8284
Epoch 42/200
195/195 [==============================] - 21s 110ms/step - loss: 0.4333 - accuracy: 0.8490 - val_loss: 0.4560 - val_accuracy: 0.8499
Epoch 43/200
195/195 [==============================] - 21s 110ms/step - loss: 0.4198 - accuracy: 0.8555 - val_loss: 0.4817 - val_accuracy: 0.8429
Epoch 44/200
195/195 [==============================] - 21s 107ms/step - loss: 0.4130 - accuracy: 0.8556 - val_loss: 0.4768 - val_accuracy: 0.8407
Epoch 45/200
195/195 [==============================] - 21s 109ms/step - loss: 0.4180 - accuracy: 0.8544 - val_loss: 0.4526 - val_accuracy: 0.8483
Epoch 46/200
195/195 [==============================] - 21s 108ms/step - loss: 0.4113 - accuracy: 0.8565 - val_loss: 0.4129 - val_accuracy: 0.8618
Epoch 47/200
195/195 [==============================] - 21s 108ms/step - loss: 0.4078 - accuracy: 0.8584 - val_loss: 0.4108 - val_accuracy: 0.8659
Epoch 48/200
195/195 [==============================] - 21s 109ms/step - loss: 0.4184 - accuracy: 0.8538 - val_loss: 0.4370 - val_accuracy: 0.8557
Epoch 49/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3926 - accuracy: 0.8641 - val_loss: 0.3817 - val_accuracy: 0.8685
Epoch 50/200
195/195 [==============================] - 21s 109ms/step - loss: 0.4044 - accuracy: 0.8587 - val_loss: 0.4225 - val_accuracy: 0.8571
Epoch 51/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3919 - accuracy: 0.8640 - val_loss: 0.4101 - val_accuracy: 0.8625
Epoch 52/200
195/195 [==============================] - 21s 106ms/step - loss: 0.4035 - accuracy: 0.8623 - val_loss: 0.4341 - val_accuracy: 0.8561
Epoch 53/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3963 - accuracy: 0.8619 - val_loss: 0.4180 - val_accuracy: 0.8576
Epoch 54/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3901 - accuracy: 0.8635 - val_loss: 0.3744 - val_accuracy: 0.8712
Epoch 55/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3917 - accuracy: 0.8640 - val_loss: 0.3751 - val_accuracy: 0.8736
Epoch 56/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3795 - accuracy: 0.8679 - val_loss: 0.4697 - val_accuracy: 0.8445
Epoch 57/200
195/195 [==============================] - 22s 111ms/step - loss: 0.3844 - accuracy: 0.8656 - val_loss: 0.4058 - val_accuracy: 0.8620
Epoch 58/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3864 - accuracy: 0.8656 - val_loss: 0.4226 - val_accuracy: 0.8588
Epoch 59/200
195/195 [==============================] - 22s 110ms/step - loss: 0.3821 - accuracy: 0.8684 - val_loss: 0.3986 - val_accuracy: 0.8666
Epoch 60/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3728 - accuracy: 0.8708 - val_loss: 0.4196 - val_accuracy: 0.8638
Epoch 61/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3724 - accuracy: 0.8699 - val_loss: 0.3928 - val_accuracy: 0.8654
Epoch 62/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3724 - accuracy: 0.8712 - val_loss: 0.3615 - val_accuracy: 0.8782
Epoch 63/200
195/195 [==============================] - 22s 110ms/step - loss: 0.3758 - accuracy: 0.8691 - val_loss: 0.3976 - val_accuracy: 0.8707
Epoch 64/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3698 - accuracy: 0.8714 - val_loss: 0.4429 - val_accuracy: 0.8554
Epoch 65/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3570 - accuracy: 0.8750 - val_loss: 0.3702 - val_accuracy: 0.8740
Epoch 66/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3588 - accuracy: 0.8751 - val_loss: 0.3885 - val_accuracy: 0.8717
Epoch 67/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3597 - accuracy: 0.8749 - val_loss: 0.3781 - val_accuracy: 0.8777
Epoch 68/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3590 - accuracy: 0.8756 - val_loss: 0.4230 - val_accuracy: 0.8613
Epoch 69/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3540 - accuracy: 0.8756 - val_loss: 0.3972 - val_accuracy: 0.8694
Epoch 70/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3588 - accuracy: 0.8729 - val_loss: 0.4242 - val_accuracy: 0.8598
Epoch 71/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3608 - accuracy: 0.8748 - val_loss: 0.3887 - val_accuracy: 0.8683
Epoch 72/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3511 - accuracy: 0.8783 - val_loss: 0.3912 - val_accuracy: 0.8716
Epoch 73/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3516 - accuracy: 0.8769 - val_loss: 0.4673 - val_accuracy: 0.8515
Epoch 74/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3484 - accuracy: 0.8787 - val_loss: 0.3990 - val_accuracy: 0.8664
Epoch 75/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3506 - accuracy: 0.8780 - val_loss: 0.3869 - val_accuracy: 0.8666
Epoch 76/200
195/195 [==============================] - 20s 105ms/step - loss: 0.3484 - accuracy: 0.8795 - val_loss: 0.3447 - val_accuracy: 0.8853
Epoch 77/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3493 - accuracy: 0.8774 - val_loss: 0.3644 - val_accuracy: 0.8794
Epoch 78/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3443 - accuracy: 0.8813 - val_loss: 0.4117 - val_accuracy: 0.8665
Epoch 79/200
195/195 [==============================] - 20s 104ms/step - loss: 0.3436 - accuracy: 0.8796 - val_loss: 0.3695 - val_accuracy: 0.8758
Epoch 80/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3487 - accuracy: 0.8788 - val_loss: 0.3583 - val_accuracy: 0.8789
Epoch 81/200
Epoch 92/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3320 - accuracy: 0.8834 - val_loss: 0.3658 - val_accuracy: 0.8794
Epoch 93/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3251 - accuracy: 0.8858 - val_loss: 0.4003 - val_accuracy: 0.8646
Epoch 94/200
195/195 [==============================] - 20s 103ms/step - loss: 0.3202 - accuracy: 0.8894 - val_loss: 0.3943 - val_accuracy: 0.8695
Epoch 95/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3238 - accuracy: 0.8887 - val_loss: 0.3232 - val_accuracy: 0.8931
Epoch 96/200
195/195 [==============================] - 21s 105ms/step - loss: 0.3236 - accuracy: 0.8881 - val_loss: 0.3659 - val_accuracy: 0.8777
Epoch 97/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3116 - accuracy: 0.8912 - val_loss: 0.4218 - val_accuracy: 0.8634
Epoch 98/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3189 - accuracy: 0.8893 - val_loss: 0.3783 - val_accuracy: 0.8740
Epoch 99/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3260 - accuracy: 0.8845 - val_loss: 0.3418 - val_accuracy: 0.8875
Epoch 100/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3143 - accuracy: 0.8893 - val_loss: 0.3974 - val_accuracy: 0.8671
Epoch 101/200
195/195 [==============================] - 20s 105ms/step - loss: 0.3209 - accuracy: 0.8898 - val_loss: 0.3688 - val_accuracy: 0.8780
Epoch 102/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3205 - accuracy: 0.8885 - val_loss: 0.3689 - val_accuracy: 0.8791
Epoch 103/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3157 - accuracy: 0.8884 - val_loss: 0.3420 - val_accuracy: 0.8857
Epoch 104/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3163 - accuracy: 0.8878 - val_loss: 0.3580 - val_accuracy: 0.8821
Epoch 105/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3105 - accuracy: 0.8915 - val_loss: 0.3696 - val_accuracy: 0.8800
Epoch 106/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3127 - accuracy: 0.8893 - val_loss: 0.3701 - val_accuracy: 0.8799
Epoch 107/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3087 - accuracy: 0.8917 - val_loss: 0.3604 - val_accuracy: 0.8831
Epoch 108/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3097 - accuracy: 0.8916 - val_loss: 0.3311 - val_accuracy: 0.8923
Epoch 109/200
195/195 [==============================] - 21s 106ms/step - loss: 0.3096 - accuracy: 0.8907 - val_loss: 0.3421 - val_accuracy: 0.8880
Epoch 110/200
195/195 [==============================] - 21s 110ms/step - loss: 0.3082 - accuracy: 0.8925 - val_loss: 0.3207 - val_accuracy: 0.8933
Epoch 111/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2997 - accuracy: 0.8967 - val_loss: 0.3400 - val_accuracy: 0.8858
Epoch 112/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3026 - accuracy: 0.8929 - val_loss: 0.3821 - val_accuracy: 0.8769
Epoch 113/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2996 - accuracy: 0.8940 - val_loss: 0.3453 - val_accuracy: 0.8861
Epoch 114/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3033 - accuracy: 0.8935 - val_loss: 0.3850 - val_accuracy: 0.8733
Epoch 115/200
195/195 [==============================] - 21s 108ms/step - loss: 0.3046 - accuracy: 0.8942 - val_loss: 0.3396 - val_accuracy: 0.8880
Epoch 116/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2998 - accuracy: 0.8946 - val_loss: 0.3496 - val_accuracy: 0.8826
Epoch 117/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3100 - accuracy: 0.8914 - val_loss: 0.4213 - val_accuracy: 0.8632
Epoch 118/200
195/195 [==============================] - 21s 107ms/step - loss: 0.3099 - accuracy: 0.8905 - val_loss: 0.3623 - val_accuracy: 0.8787
Epoch 119/200
195/195 [==============================] - 22s 110ms/step - loss: 0.3096 - accuracy: 0.8929 - val_loss: 0.3523 - val_accuracy: 0.8841
Epoch 120/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2990 - accuracy: 0.8952 - val_loss: 0.3645 - val_accuracy: 0.8803
Epoch 121/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2986 - accuracy: 0.8940 - val_loss: 0.3947 - val_accuracy: 0.8701
Epoch 122/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3002 - accuracy: 0.8934 - val_loss: 0.3854 - val_accuracy: 0.8746
Epoch 123/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2957 - accuracy: 0.8962 - val_loss: 0.3649 - val_accuracy: 0.8787
Epoch 124/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2926 - accuracy: 0.8967 - val_loss: 0.3245 - val_accuracy: 0.8948
Epoch 125/200
195/195 [==============================] - 21s 109ms/step - loss: 0.3024 - accuracy: 0.8933 - val_loss: 0.3376 - val_accuracy: 0.8896
Epoch 126/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2904 - accuracy: 0.8984 - val_loss: 0.3394 - val_accuracy: 0.8867
Epoch 127/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2974 - accuracy: 0.8974 - val_loss: 0.3591 - val_accuracy: 0.8842
Epoch 128/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2942 - accuracy: 0.8978 - val_loss: 0.3455 - val_accuracy: 0.8848
Epoch 129/200
195/195 [==============================] - 21s 106ms/step - loss: 0.2940 - accuracy: 0.8970 - val_loss: 0.3400 - val_accuracy: 0.8883
Epoch 130/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2973 - accuracy: 0.8973 - val_loss: 0.3286 - val_accuracy: 0.8905
Epoch 131/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2903 - accuracy: 0.8948 - val_loss: 0.4064 - val_accuracy: 0.8707
Epoch 132/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2962 - accuracy: 0.8963 - val_loss: 0.3689 - val_accuracy: 0.8773
Epoch 133/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2918 - accuracy: 0.8971 - val_loss: 0.3666 - val_accuracy: 0.8808
Epoch 134/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2894 - accuracy: 0.8991 - val_loss: 0.3306 - val_accuracy: 0.8918
Epoch 135/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2809 - accuracy: 0.9020 - val_loss: 0.3157 - val_accuracy: 0.8940
Epoch 136/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2878 - accuracy: 0.8996 - val_loss: 0.3568 - val_accuracy: 0.8847
Epoch 137/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2903 - accuracy: 0.8981 - val_loss: 0.3422 - val_accuracy: 0.8914
Epoch 138/200
195/195 [==============================] - 20s 104ms/step - loss: 0.2841 - accuracy: 0.8986 - val_loss: 0.3276 - val_accuracy: 0.8910
Epoch 139/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2892 - accuracy: 0.8994 - val_loss: 0.3350 - val_accuracy: 0.8909
Epoch 140/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2863 - accuracy: 0.9000 - val_loss: 0.3634 - val_accuracy: 0.8817
Epoch 141/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2884 - accuracy: 0.8983 - val_loss: 0.3368 - val_accuracy: 0.8903
Epoch 142/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2903 - accuracy: 0.8988 - val_loss: 0.3643 - val_accuracy: 0.8820
Epoch 143/200
195/195 [==============================] - 21s 105ms/step - loss: 0.2818 - accuracy: 0.8997 - val_loss: 0.3178 - val_accuracy: 0.8933
Epoch 144/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2713 - accuracy: 0.9042 - val_loss: 0.3584 - val_accuracy: 0.8840
Epoch 145/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2907 - accuracy: 0.8990 - val_loss: 0.3286 - val_accuracy: 0.8921
Epoch 146/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2745 - accuracy: 0.9045 - val_loss: 0.3450 - val_accuracy: 0.8890
Epoch 147/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2816 - accuracy: 0.9028 - val_loss: 0.3895 - val_accuracy: 0.8715
Epoch 148/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2777 - accuracy: 0.9041 - val_loss: 0.3372 - val_accuracy: 0.8896
Epoch 149/200
195/195 [==============================] - 21s 105ms/step - loss: 0.2700 - accuracy: 0.9070 - val_loss: 0.3615 - val_accuracy: 0.8803
Epoch 150/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2741 - accuracy: 0.9033 - val_loss: 0.3605 - val_accuracy: 0.8813
Epoch 151/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2890 - accuracy: 0.8979 - val_loss: 0.3490 - val_accuracy: 0.8854
Epoch 152/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2784 - accuracy: 0.9008 - val_loss: 0.3543 - val_accuracy: 0.8838
Epoch 153/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2803 - accuracy: 0.9014 - val_loss: 0.3356 - val_accuracy: 0.8876
Epoch 154/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2719 - accuracy: 0.9031 - val_loss: 0.3338 - val_accuracy: 0.8894
Epoch 155/200
195/195 [==============================] - 21s 106ms/step - loss: 0.2830 - accuracy: 0.9019 - val_loss: 0.3505 - val_accuracy: 0.8893
Epoch 156/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2830 - accuracy: 0.9002 - val_loss: 0.3173 - val_accuracy: 0.8983
Epoch 157/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2764 - accuracy: 0.9015 - val_loss: 0.3789 - val_accuracy: 0.8765
Epoch 158/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2742 - accuracy: 0.9040 - val_loss: 0.3245 - val_accuracy: 0.8941
Epoch 159/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2801 - accuracy: 0.9014 - val_loss: 0.3342 - val_accuracy: 0.8905
Epoch 160/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2640 - accuracy: 0.9064 - val_loss: 0.3632 - val_accuracy: 0.8818
Epoch 161/200
195/195 [==============================] - 21s 106ms/step - loss: 0.2754 - accuracy: 0.9026 - val_loss: 0.3204 - val_accuracy: 0.8936
Epoch 162/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2745 - accuracy: 0.9040 - val_loss: 0.3921 - val_accuracy: 0.8769
Epoch 163/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2731 - accuracy: 0.9031 - val_loss: 0.3234 - val_accuracy: 0.8939
Epoch 164/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2699 - accuracy: 0.9062 - val_loss: 0.3466 - val_accuracy: 0.8873
Epoch 165/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2866 - accuracy: 0.9002 - val_loss: 0.3669 - val_accuracy: 0.8820
Epoch 166/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2657 - accuracy: 0.9058 - val_loss: 0.3208 - val_accuracy: 0.8930
Epoch 167/200
195/195 [==============================] - 20s 105ms/step - loss: 0.2769 - accuracy: 0.9014 - val_loss: 0.3339 - val_accuracy: 0.8912
Epoch 168/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2739 - accuracy: 0.9037 - val_loss: 0.3357 - val_accuracy: 0.8885
Epoch 169/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2739 - accuracy: 0.9059 - val_loss: 0.4047 - val_accuracy: 0.8727
Epoch 170/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2666 - accuracy: 0.9063 - val_loss: 0.3386 - val_accuracy: 0.8904
Epoch 171/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2660 - accuracy: 0.9073 - val_loss: 0.3169 - val_accuracy: 0.8945
Epoch 172/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2692 - accuracy: 0.9054 - val_loss: 0.3413 - val_accuracy: 0.8859
Epoch 173/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2672 - accuracy: 0.9050 - val_loss: 0.3230 - val_accuracy: 0.8930
Epoch 174/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2776 - accuracy: 0.9026 - val_loss: 0.3204 - val_accuracy: 0.8966
Epoch 175/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2646 - accuracy: 0.9073 - val_loss: 0.3433 - val_accuracy: 0.8937
Epoch 176/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2670 - accuracy: 0.9057 - val_loss: 0.3301 - val_accuracy: 0.8927
Epoch 177/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2697 - accuracy: 0.9046 - val_loss: 0.3110 - val_accuracy: 0.8979
Epoch 178/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2711 - accuracy: 0.9043 - val_loss: 0.3240 - val_accuracy: 0.8944
Epoch 179/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2628 - accuracy: 0.9072 - val_loss: 0.3265 - val_accuracy: 0.8931
Epoch 180/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2642 - accuracy: 0.9070 - val_loss: 0.3192 - val_accuracy: 0.8954
Epoch 181/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2626 - accuracy: 0.9067 - val_loss: 0.3404 - val_accuracy: 0.8875
Epoch 182/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2635 - accuracy: 0.9080 - val_loss: 0.3463 - val_accuracy: 0.8874
Epoch 183/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2630 - accuracy: 0.9075 - val_loss: 0.3342 - val_accuracy: 0.8909
Epoch 184/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2666 - accuracy: 0.9036 - val_loss: 0.2964 - val_accuracy: 0.9011
Epoch 185/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2671 - accuracy: 0.9067 - val_loss: 0.3400 - val_accuracy: 0.8905
Epoch 186/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2625 - accuracy: 0.9084 - val_loss: 0.3446 - val_accuracy: 0.8889
Epoch 187/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2606 - accuracy: 0.9097 - val_loss: 0.3242 - val_accuracy: 0.8955
Epoch 188/200
195/195 [==============================] - 22s 111ms/step - loss: 0.2588 - accuracy: 0.9094 - val_loss: 0.3240 - val_accuracy: 0.8958
Epoch 189/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2649 - accuracy: 0.9070 - val_loss: 0.3216 - val_accuracy: 0.8980
Epoch 190/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2587 - accuracy: 0.9077 - val_loss: 0.3403 - val_accuracy: 0.8891
Epoch 191/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2678 - accuracy: 0.9033 - val_loss: 0.3099 - val_accuracy: 0.9008
Epoch 192/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2538 - accuracy: 0.9094 - val_loss: 0.3170 - val_accuracy: 0.8968
Epoch 193/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2613 - accuracy: 0.9076 - val_loss: 0.2916 - val_accuracy: 0.9046
Epoch 194/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2651 - accuracy: 0.9077 - val_loss: 0.3159 - val_accuracy: 0.8968
Epoch 195/200
195/195 [==============================] - 22s 110ms/step - loss: 0.2576 - accuracy: 0.9097 - val_loss: 0.3446 - val_accuracy: 0.8901
Epoch 196/200
195/195 [==============================] - 21s 109ms/step - loss: 0.2554 - accuracy: 0.9094 - val_loss: 0.3227 - val_accuracy: 0.8978
Epoch 197/200
195/195 [==============================] - 21s 108ms/step - loss: 0.2620 - accuracy: 0.9090 - val_loss: 0.3174 - val_accuracy: 0.8958
Epoch 198/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2583 - accuracy: 0.9082 - val_loss: 0.3186 - val_accuracy: 0.8964
Epoch 199/200
195/195 [==============================] - 21s 107ms/step - loss: 0.2546 - accuracy: 0.9103 - val_loss: 0.3183 - val_accuracy: 0.8968
Epoch 200/200
195/195 [==============================] - 21s 110ms/step - loss: 0.2544 - accuracy: 0.9082 - val_loss: 0.3327 - val_accuracy: 0.8948
313/313 [==============================] - 1s 3ms/step - loss: 0.3327 - accuracy: 0.8948
Accuracy is :  89.48000073432922
Loss is :  0.3326900005340576

(I cut the first 40 epochs because the post body is limited to 30,000 characters; epochs 1-40 kept improving, just slowly.) I first tried 100 epochs, which gave me about 88% validation accuracy with this code; adding another 100 epochs only gave me about a 1% improvement (~89%).

My questions are:

  1. Is my model overfitting?
  2. Why is my model improving so slowly?
  3. Can my model improve if I add more epochs?
  4. How can I increase the accuracy and reduce the loss, since it seems to have stagnated?

The model performance plot is attached here.

  1. No - the validation loss is not increasing.
  2. Your plot looks fine. It is expected that training progress slows down over time.
  3. Yes, but it doesn't make much sense. If you train any model towards infinity, its performance will keep improving forever - for example, if you train for a year you might get 89.5% accuracy (better than 89.48%).
  4. Try decaying the learning rate with different schedules (see the sketch below).
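A minimal sketch of point 4 (not from the original answer), using the standard tf.keras callbacks ReduceLROnPlateau and LearningRateScheduler; the factor, patience, and step values below are illustrative assumptions, not tuned settings:

from tensorflow.keras.callbacks import ReduceLROnPlateau, LearningRateScheduler

# Option 1: halve the learning rate whenever val_loss has not improved for 5 epochs
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=5, min_lr=1e-6)

# Option 2: a fixed step schedule that halves the learning rate every 50 epochs
def step_decay(epoch, lr):
    return lr * 0.5 if epoch > 0 and epoch % 50 == 0 else lr

lr_schedule = LearningRateScheduler(step_decay)

# Pass one of the callbacks to the training call, e.g.:
# history = model.fit(datagen.flow(train_images, train_labels, batch_size=256),
#                     epochs=200, validation_data=(test_images, test_labels),
#                     callbacks=[reduce_lr])

Either way, training starts from the configured learning rate and reduces it as progress stalls, which often helps a little when accuracy plateaus the way it does here.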
