Deep Neural Network Not Learning Anything

I am training a simple neural network with some Dense and Dropout layers, but when I run the fit function, no training takes place. My model is:

import tensorflow as tf
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.models import Sequential

Model = Sequential()
Model.add(Dense(32, input_dim=12, activation='relu'))

for i in range(4):
    Model.add(Dense(2**(5+i), activation='relu'))
    Model.add(Dropout(0.5))

Model.add(Dense(1, activation='softmax'))

Model.summary()

Model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.1),
    loss=tf.keras.losses.BinaryCrossentropy(),
    metrics=[tf.keras.metrics.Accuracy()]
)

my_callbacks = [
    tf.keras.callbacks.EarlyStopping(patience=2),
]

Model.fit(x=X, y=Y, batch_size=32, epochs=20, verbose=1, validation_split=0.1, callbacks=my_callbacks)

The training results are:

Epoch 1/20
26/26 [==============================] - 0s 5ms/step - loss: 9.3856 - accuracy: 0.3845 - val_loss: 9.4884 - val_accuracy: 0.3778
Epoch 2/20
26/26 [==============================] - 0s 3ms/step - loss: 9.3856 - accuracy: 0.3845 - val_loss: 9.4884 - val_accuracy: 0.3778
Epoch 3/20
26/26 [==============================] - 0s 3ms/step - loss: 9.3856 - accuracy: 0.3845 - val_loss: 9.4884 - val_accuracy: 0.3778
<tensorflow.python.keras.callbacks.History at 0x7f94a623ffd0>

To see what the model actually predicts, I ran:

Model.predict(X)[:10]

Results:

array([[1.],
       [1.],
       [1.],
       [1.],
       [1.],
       [1.],
       [1.],
       [1.],
       [1.],
       [1.]], dtype=float32)

and so on (all predictions are 1.0).

My input dataframes are:

X.head(5)
    0            1           2           3           4           5           6           7           8           9           10          11
0   -0.572351   -0.518084   0.919925    -0.743497   0.743497    -0.50977    -0.32204    0.655011    -0.611972   0.481288    -0.445  -0.503402
1   1.747178    -0.518084   -1.087045   1.344995    -1.344995   1.96167     -0.32204    -1.526692   0.630431    0.481288    -0.445  0.734222
2   -0.572351   -0.518084   0.919925    1.344995    -1.344995   -0.50977    -0.32204    0.655011    -0.301371   -0.479087   -0.445  -0.490356
3   1.747178    -0.518084   -1.087045   1.344995    -1.344995   -0.50977    -0.32204    0.655011    0.397481    0.481288    -0.445  0.382778
4   -0.572351   -0.518084   0.919925    -0.743497   0.743497    -0.50977    -0.32204    0.655011    0.397481    -0.479087   -0.445  -0.487940

Y.head(10):

0    0
1    1
2    1
3    1
4    0
5    0
6    0
7    0
8    1
9    1
Name: Survived, dtype: int64

I can't figure out what mistake I am making.

Use a sigmoid activation function if you only have 1 output neuron with 2 possibilities. Softmax normalizes across all output neurons, so with a single output neuron it returns exactly 1.0 no matter what the input is; that is why every prediction above is 1.0 and the loss never changes. If you want to use softmax, use 2 output neurons and one-hot encode your labels, like [0,1] or [1,0].
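
You can confirm this behaviour directly (a minimal illustration, not from the original answer):

import tensorflow as tf

# Softmax over a single logit is e**z / e**z = 1.0, whatever the value of z:
print(tf.nn.softmax([[3.2], [-1.7]]))  # -> [[1.], [1.]]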

This issue is explained here: https://mc.ai/softmax-output-neurons-number-for-binary-classification/
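
For reference, a minimal sketch of the corrected setup, keeping the architecture from the question. The default Adam learning rate and the 'accuracy' metric string are assumptions on my part, not part of the original answer:

import tensorflow as tf
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.models import Sequential

Model = Sequential()
Model.add(Dense(32, input_dim=12, activation='relu'))

for i in range(4):
    Model.add(Dense(2**(5+i), activation='relu'))
    Model.add(Dropout(0.5))

# Sigmoid maps the single logit to a probability in (0, 1), instead of
# softmax normalizing it to a constant 1.0:
Model.add(Dense(1, activation='sigmoid'))

Model.compile(
    optimizer=tf.keras.optimizers.Adam(),  # assumption: default learning rate (0.1 is unusually high)
    loss=tf.keras.losses.BinaryCrossentropy(),
    # assumption: the 'accuracy' string lets Keras pick binary accuracy for
    # this loss; tf.keras.metrics.Accuracy() from the question instead checks
    # exact equality between labels and raw predictions
    metrics=['accuracy']
)

# X and Y as in the question
Model.fit(x=X, y=Y, batch_size=32, epochs=20, verbose=1, validation_split=0.1)

# Alternative from the answer: Dense(2, activation='softmax') as the output
# layer, with one-hot encoded labels and CategoricalCrossentropy as the loss.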
