
Keep training a Keras model by saving and loading the weights

Since I cannot install h5py due to a package inconsistency, I am wondering whether it is possible to save and load the weights in Keras so I can keep training my model on new data. I know I can do the following:

   old_weights = model.get_weights()
   del model
   new_model.set_weights(old_weights)

where model is the old model and new_model is the new one. Here is a complete example:

from keras.models import Sequential
from keras.layers import Dense

for X, y in training_data:
    # build and train a fresh model on the current example
    model = Sequential()
    model.add(Dense(20, activation='tanh', input_dim=Input))
    model.add(Dense(1))
    model.compile(optimizer='adam', loss='mse')
    model.fit(X, y, epochs=8, batch_size=16, shuffle=False, verbose=0)
    # build another model with the same architecture and copy the trained weights into it
    new_model = Sequential()
    new_model.add(Dense(20, activation='tanh', input_dim=Input))
    new_model.add(Dense(1))
    new_model.compile(optimizer='adam', loss='mse')
    old_weights = model.get_weights()
    del model
    new_model.set_weights(old_weights)
    model = new_model

After reading each training example (X and y are different at each iteration), I want to save the weights, load them again, and continue training from the pre-trained model. I am not sure my code does that, since I define the optimizer and call model.compile again on every iteration. Can anyone tell me whether the code above saves the model and whether every iteration really starts from the pre-trained weights?

You don't need to keep recompiling the model. Instead, build and compile it once, then just fit the same model multiple times as you load your samples.

from keras.models import Sequential
from keras.layers import Dense

# Build and compile the model once, outside the loop.
model = Sequential()
model.add(Dense(20, activation='tanh', input_dim=Input))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mse')

# load the data into training_data
for data in training_data:
    # Each call to fit continues from the current weights, so training resumes.
    model.fit(data[0], data[1], epochs=8, batch_size=16, shuffle=False, verbose=0)
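
If the weights also need to survive between Python sessions and h5py is unavailable, one workaround is to dump the list returned by get_weights() with plain NumPy. This is only a minimal sketch, not part of the original answer: the file name model_weights.npz is illustrative, and it assumes the identical architecture is rebuilt before restoring.

import numpy as np

# Persist the current weights (a list of NumPy arrays) without h5py.
weights = model.get_weights()
np.savez("model_weights.npz", *weights)  # hypothetical file name

# In a later session: rebuild the same architecture, then restore.
archive = np.load("model_weights.npz")
restored = [archive["arr_%d" % i] for i in range(len(archive.files))]
model.set_weights(restored)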
