
Accuracy decreases during training of a single batch with Keras?

Usually, when training a deep neural network with Keras, the training accuracy increases over the course of a single batch of training.

Like this:

 10/189 [>.............................] - ETA: 9s - loss: 0.6919 - acc: 0.8000
 20/189 [==>...........................] - ETA: 4s - loss: 0.6905 - acc: 0.9000
 40/189 [=====>........................] - ETA: 2s - loss: 0.6879 - acc: 0.9500
 60/189 [========>.....................] - ETA: 1s - loss: 0.6852 - acc: 0.9667
 80/189 [===========>..................] - ETA: 1s - loss: 0.6821 - acc: 0.9750
 90/189 [=============>................] - ETA: 1s - loss: 0.6806 - acc: 0.9778
100/189 [==============>...............] - ETA: 0s - loss: 0.6785 - acc: 0.9800
120/189 [==================>...........] - ETA: 0s - loss: 0.6764 - acc: 0.9667
130/189 [===================>..........] - ETA: 0s - loss: 0.6743 - acc: 0.9692
140/189 [=====================>........] - ETA: 0s - loss: 0.6721 - acc: 0.9714
160/189 [========================>.....] - ETA: 0s - loss: 0.6691 - acc: 0.9688
180/189 [===========================>..] - ETA: 0s - loss: 0.6650 - acc: 0.9722
189/189 [==============================] - 1s 8ms/step - loss: 0.6630 - acc: 0.9735
Epoch 1/1

But sometimes (usually in later batches), the accuracy decreases:

 10/190 [>.............................] - ETA: 1s - loss: 0.0114 - acc: 1.0000
 20/190 [==>...........................] - ETA: 0s - loss: 0.0073 - acc: 1.0000
 30/190 [===>..........................] - ETA: 0s - loss: 0.0067 - acc: 1.0000
 40/190 [=====>........................] - ETA: 0s - loss: 0.0105 - acc: 1.0000
 50/190 [======>.......................] - ETA: 0s - loss: 0.0785 - acc: 0.9800
 60/190 [========>.....................] - ETA: 0s - loss: 0.0729 - acc: 0.9833
 70/190 [==========>...................] - ETA: 0s - loss: 0.0632 - acc: 0.9857
 80/190 [===========>..................] - ETA: 0s - loss: 0.1083 - acc: 0.9750
 90/190 [=============>................] - ETA: 0s - loss: 0.1396 - acc: 0.9667
100/190 [==============>...............] - ETA: 0s - loss: 0.1291 - acc: 0.9700
110/190 [================>.............] - ETA: 0s - loss: 0.1180 - acc: 0.9727
120/190 [=================>............] - ETA: 0s - loss: 0.1133 - acc: 0.9750
130/190 [===================>..........] - ETA: 0s - loss: 0.1050 - acc: 0.9769
140/190 [=====================>........] - ETA: 0s - loss: 0.0980 - acc: 0.9786
150/190 [======================>.......] - ETA: 0s - loss: 0.0923 - acc: 0.9800
160/190 [========================>.....] - ETA: 0s - loss: 0.0866 - acc: 0.9812
170/190 [=========================>....] - ETA: 0s - loss: 0.0848 - acc: 0.9824
180/190 [===========================>..] - ETA: 0s - loss: 0.0802 - acc: 0.9833
190/190 [==============================] - 1s 5ms/step - loss: 0.0762 - acc: 0.9842

I do keep a held-out test set, so I don't have to worry about that here. I am just wondering: how is this possible while the model is being optimized on each single batch?

The model, for reference:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(12, input_dim=191226, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

...

model.fit(X_train_a, y_train_a, epochs=1, batch_size=10)

The accuracy you see in the progress bar is actually the running average of the per-batch accuracies, so it can go up or down. The model does not necessarily have to improve its accuracy on every batch.
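This running-average behavior can be illustrated with a small sketch. The per-batch accuracies below are hypothetical values chosen to resemble the question's second log (batch_size=10, so each progress-bar step adds one batch), not taken from the actual run:

```python
# Hypothetical per-batch accuracies: four perfect batches, then one
# batch with a mistake, then perfect batches again.
batch_acc = [1.0, 1.0, 1.0, 1.0, 0.9, 1.0]

# The progress bar shows the mean of all batch accuracies seen so far.
running = [sum(batch_acc[:i + 1]) / (i + 1) for i in range(len(batch_acc))]

for step, avg in enumerate(running, start=1):
    print(f"batch {step}: running acc = {avg:.4f}")
```

The running average stays at 1.0000 for the first four batches, dips to 0.9800 once the fifth batch is included, and then slowly climbs again even though every later batch is perfect, which matches the shape of the logged output.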

Also note that it is the loss that is minimized, not the accuracy directly, and it is entirely possible for the loss to decrease while the accuracy also decreases, although usually accuracy improves as the loss goes down.
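A minimal numeric sketch of that point, with made-up predicted probabilities: one borderline-correct prediction can flip to barely wrong while another prediction becomes much more confident, so the mean binary cross-entropy drops even though the 0.5-threshold accuracy drops too.

```python
import math

def bce(y_true, y_prob):
    # Mean binary cross-entropy over labels and predicted probabilities.
    return -sum(y * math.log(p) + (1 - y) * math.log(1 - p)
                for y, p in zip(y_true, y_prob)) / len(y_true)

def accuracy(y_true, y_prob):
    # Fraction of predictions on the correct side of the 0.5 threshold.
    return sum((p >= 0.5) == bool(y) for y, p in zip(y_true, y_prob)) / len(y_true)

y = [1, 1]
before = [0.51, 0.60]  # both barely correct
after  = [0.49, 0.99]  # one flips wrong, the other becomes very confident

print(bce(y, before), accuracy(y, before))  # ~0.592, acc 1.0
print(bce(y, after), accuracy(y, after))    # ~0.362, acc 0.5
```

The update lowered the loss (0.592 to roughly 0.362) while the accuracy fell from 1.0 to 0.5, since cross-entropy rewards confidence on correct examples more than it penalizes one marginal flip.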
