
Keras extremely high loss, not decreasing with each epoch

I'm using Keras to build and train a recurrent neural network.

from keras.models import Sequential
from keras.layers.core import Dense, Activation, Masking
from keras.layers.recurrent import LSTM

#build and train model
in_dimension = 3
hidden_neurons = 300
out_dimension = 2

model = Sequential()
model.add(Masking([0,0,0], input_shape=(max_sequence_length, in_dimension)))
model.add(LSTM(hidden_neurons, return_sequences=True, input_shape=(max_sequence_length, in_dimension)))
model.add(LSTM(hidden_neurons, return_sequences=False))
model.add(Dense(out_dimension))
model.add(Activation('softmax'))

model.compile(loss="categorical_crossentropy", optimizer="rmsprop")
model.fit(padded_training_seqs, training_final_steps, nb_epoch=5, batch_size=1)

padded_training_seqs is an array of sequences of [latitude, longitude, temperature], all padded to the same length with [0,0,0] values. When I train this network, the first epoch gives a loss of about 63, which increases over subsequent epochs. This causes a model.predict call later in the code to give values that are completely off from the training values. For example, most of the training values in each sequence are around [40, 40, 20], but the RNN consistently outputs values around [0.4, 0.5], which makes me think something is wrong with the masking layer.
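As an aside on the prediction scale: the model ends in a two-unit softmax, whose outputs always sum to 1, so values on the scale of [0.4, 0.5] are the only outputs this architecture can produce regardless of masking. A minimal NumPy sketch (illustrative only, not part of the original code) demonstrates this:

import numpy as np

# A softmax over two units always yields a pair of probabilities summing to 1,
# so outputs around [0.4, 0.5] are all this output layer can ever emit.
def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

print(softmax(np.array([0.3, 0.5])))  # -> [0.450..., 0.549...], sums to 1.0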

The training X (padded_training_seqs) data looks something like this (only much larger):

[
[[43.103, 27.092, 19.078], [43.496, 26.746, 19.198], [43.487, 27.363, 19.092], [44.107, 27.779, 18.487], [44.529, 27.888, 17.768]], 
[[44.538, 27.901, 17.756], [44.663, 28.073, 17.524], [44.623, 27.83, 17.401], [44.68, 28.034, 17.601], [0,0,0]],
[[47.236, 31.43, 13.905], [47.378, 31.148, 13.562], [0,0,0], [0,0,0], [0,0,0]]
]

and the training Y (training_final_steps) data looks like this:

[
[44.652, 39.649], [37.362, 54.106], [37.115, 57.66501]
]

I am fairly certain that you're misusing the Masking layer from Keras. Check the Keras documentation on the Masking layer for more details.

Try using a masking layer like:

model.add(Masking(mask_value=0, input_shape=(max_sequence_length, in_dimension)))

because I believe Masking expects a single scalar mask value rather than a per-feature vector like [0,0,0]: a timestep is masked when all of its features equal that value, so your all-zero padding rows are still skipped.
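For context, here is a minimal sketch of how that fix could slot into the model from the question (the dimensions are copied from the question, with max_sequence_length set to 5 to match the sample data; an illustration under those assumptions, not a tested fix):

from keras.models import Sequential
from keras.layers.core import Dense, Activation, Masking
from keras.layers.recurrent import LSTM

# values taken from the question
max_sequence_length = 5
in_dimension = 3
hidden_neurons = 300
out_dimension = 2

model = Sequential()
# Scalar mask value: a timestep is skipped when ALL of its features equal 0,
# which matches the [0,0,0] padding rows.
model.add(Masking(mask_value=0, input_shape=(max_sequence_length, in_dimension)))
model.add(LSTM(hidden_neurons, return_sequences=True))
model.add(LSTM(hidden_neurons, return_sequences=False))
model.add(Dense(out_dimension))
model.add(Activation('softmax'))
model.compile(loss="categorical_crossentropy", optimizer="rmsprop")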

Best of luck.
