
Loss not decreasing and is very high in Keras

I'm learning deep learning with Keras and I've run into a problem: the loss is not decreasing and stays very high, around 650.

I'm working with the MNIST dataset from tensorflow.keras.datasets.mnist. There are no errors; my NN just isn't learning.
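(A minimal loading sketch, assuming the standard load_data() call, since the loading code itself isn't shown in the post; the labels come back as integers 0-9 and the pixels as uint8 values in 0-255.)

```python
from tensorflow.keras.datasets import mnist

# Standard MNIST split: 60,000 training and 10,000 test samples.
# X_* are 28x28 uint8 images (0-255), Y_* are integer class labels (0-9).
(X_train, Y_train), (X_test, Y_test) = mnist.load_data()
```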

Here is my model:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten
import tensorflow.nn as tfnn

inputdim = 28 * 28

model = Sequential()

model.add(Flatten())
model.add(Dense(inputdim, activation = tfnn.relu))
model.add(Dense(128, activation = tfnn.relu))
model.add(Dense(10, activation = tfnn.softmax))

model.compile(loss = 'categorical_crossentropy', optimizer = 'adam', metrics = ['accuracy'])
model.fit(X_train, Y_train, epochs = 4)
```

And my output:

```
Epoch 1/4
60000/60000 [==============================] - 32s 527us/sample - loss: 646.0926 - acc: 6.6667e-05
Epoch 2/4
60000/60000 [==============================] - 39s 652us/sample - loss: 646.1003 - acc: 0.0000e+00
Epoch 3/4
60000/60000 [==============================] - 35s 590us/sample - loss: 646.1003 - acc: 0.0000e+00
Epoch 4/4
60000/60000 [==============================] - 33s 544us/sample - loss: 646.1003 - acc: 0.0000e+00
```

You could try using the sparse_categorical_crossentropy loss function. Also, what is your batch size? And, as already suggested, you may want to increase the number of epochs.
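For instance, a minimal sketch of those suggestions (sparse labels, an explicit batch size, and more epochs); the batch_size and epochs values here are only illustrative, not taken from the original post:

```python
# sparse_categorical_crossentropy expects integer labels (0-9),
# so no one-hot encoding of Y_train is needed.
model.compile(loss = 'sparse_categorical_crossentropy',
              optimizer = 'adam',
              metrics = ['accuracy'])

# batch_size=32 is Keras' default and is shown explicitly only for illustration;
# epochs is bumped from 4 to 10 to give the network more time to converge.
model.fit(X_train, Y_train, batch_size = 32, epochs = 10)
```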

OK, I added BatchNormalization between the layers and changed the loss function to 'sparse_categorical_crossentropy'. This is what my NN looks like now:

```python
from tensorflow.keras.layers import BatchNormalization

model = Sequential()

model.add(Flatten())
model.add(BatchNormalization(axis = 1, momentum = 0.99))
model.add(Dense(inputdim, activation = tfnn.relu))
model.add(BatchNormalization(axis = 1, momentum = 0.99))
model.add(Dense(128, activation = tfnn.relu))
model.add(BatchNormalization(axis = 1, momentum = 0.99))
model.add(Dense(10, activation = tfnn.softmax))

model.compile(loss = 'sparse_categorical_crossentropy', optimizer = 'adam', metrics = ['accuracy'])
model.fit(X_train, Y_train, epochs = 4)
```

And these are the results:

```
Epoch 1/4
60000/60000 [==============================] - 68s 1ms/sample - loss: 0.2045 - acc: 0.9374
Epoch 2/4
60000/60000 [==============================] - 55s 916us/sample - loss: 0.1007 - acc: 0.9689
```

Thank you for the help!
