
Load weights for the last layer (output layer) into a new model from a trained network

Is it possible to load the weights into the last layer of my new model from a trained network using the set_weights and get_weights scheme? The point is, I saved the weights of each layer as a .mat file (after training) to do some calculations in MATLAB, and I want only the modified weights of the last layer to be loaded into the last layer of my new model, while the other layers get the same weights as the trained model. It is a bit tricky, since the saved format is .mat.

from scipy.io import savemat

weights1 = lstm_model1.layers[0].get_weights()[0]
biases1 = lstm_model1.layers[0].get_weights()[1]
weights2 = lstm_model1.layers[2].get_weights()[0]
biases2 = lstm_model1.layers[2].get_weights()[1]
weights3 = lstm_model1.layers[4].get_weights()[0]
biases3 = lstm_model1.layers[4].get_weights()[1]

# Save the weights and biases for the adaptation algorithm in MATLAB
savemat("weights1.mat", mdict={'weights1': weights1})
savemat("biases1.mat", mdict={'biases1': biases1})
savemat("weights2.mat", mdict={'weights2': weights2})
savemat("biases2.mat", mdict={'biases2': biases2})
savemat("weights3.mat", mdict={'weights3': weights3})
savemat("biases3.mat", mdict={'biases3': biases3})

How can I load the old weights of the other layers into the new model (all except the last layer), and the modified weights of the last layer into the last layer of the new one?

If it was saved as an .h5 file, this works. However, I'm not sure about .mat:

Put simply, you just call get_weights on the desired layer, and similarly set_weights on the corresponding layer of the other model:

last_layer_weights = old_model.layers[-1].get_weights()
new_model.layers[-1].set_weights(last_layer_weights)

For a more complete code sample, here you go:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# Create an arbitrary model with some weights, for example
model = Sequential(layers = [
    Dense(70, input_shape = (100,)),
    Dense(60),
    Dense(50),
    Dense(5)])

# Save the weights of the model
model.save_weights("model.h5")

# Later, load in the model (we only really need the layer in question)
old_model = Sequential(layers = [
    Dense(70, input_shape = (100,)),
    Dense(60),
    Dense(50),
    Dense(5)])

old_model.load_weights("model.h5")

# Create a new model with slightly different architecture (except for the layer in question, at least)
new_model = Sequential(layers = [
    Dense(80, input_shape = (100,)),
    Dense(60),
    Dense(50),
    Dense(5)])

# Set the weights of the final layer of the new model to those of the final layer of the old model, leaving the other layers unchanged.
new_model.layers[-1].set_weights(old_model.layers[-1].get_weights())

# Assert that the weights of the final layer are the same, but the others are not.
print(np.all(new_model.layers[-1].get_weights()[0] == old_model.layers[-1].get_weights()[0]))
>> True

print(np.all(new_model.layers[-2].get_weights()[0] == old_model.layers[-2].get_weights()[0]))
>> False
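
As for the .mat part of the question: a minimal sketch, assuming the new model (called new_lstm_model here, a hypothetical name) has the same architecture as the trained lstm_model1, and that layer 4 is a Dense layer whose weights are a [kernel, bias] pair, as in the savemat code above. The file and variable names mirror that code; adjust them if MATLAB re-saved the arrays under different names:

from scipy.io import loadmat

# Assumption: new_lstm_model has the same architecture as lstm_model1,
# so all trained weights can be copied over in one call.
new_lstm_model.set_weights(lstm_model1.get_weights())

# Overwrite only the last layer (index 4 above) with the arrays modified
# in MATLAB. ravel() undoes the (1, n) row-vector shape that savemat
# gives 1-D bias arrays.
w3 = loadmat("weights3.mat")["weights3"]
b3 = loadmat("biases3.mat")["biases3"].ravel()
new_lstm_model.layers[4].set_weights([w3, b3])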
