
How do I get weights and biases from my model?

I have a simple neural network and I need to get the weights and biases from the model. I have tried a few approaches discussed before, but I keep getting an index-out-of-range error. Not sure how to fix this, or what I'm missing.

Network:

import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax)
])

model.layers[0].get_weights()[1]

Error - IndexError: list index out of range

This is what has been suggested in a few other questions, but I end up getting the out-of-range error when I try it.

I have another question: does the index in model.layers[index] correspond to the layer position? For instance, does model.layers[1] give the weights of the second layer, something like that?

I've been there. I went back to my old code to see how I solved that issue. What I did was print the length of model.layers[index].get_weights() to figure out where Keras was saving the weights I needed. In my old code, model.layers[0].get_weights()[0] returned the actual weights, while model.layers[0].get_weights()[1] returned the biases. In any case, keep in mind that some layers have no weights to save, so if model.layers[0].get_weights()[0] doesn't work, try model.layers[1].get_weights()[0]. I'm not sure about Flatten layers, but I do know that Dense layers save their weights.
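A minimal sketch of that diagnostic approach, using the model from the question (TensorFlow 2.x assumed):

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax),
])

# Print how many weight arrays each layer stores:
# Flatten has none, each Dense has two (kernel and bias).
for i, layer in enumerate(model.layers):
    print(i, type(layer).__name__, len(layer.get_weights()))
```

This makes it obvious which indices are safe to use before you start slicing into `get_weights()`.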

The first layer (index 0) in your model is a Flatten layer, which does not have any weights; that's why you get the error.

To get the Dense layer, which is the second layer, you have to use index 1:

model.layers[1].get_weights()[1]
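For the model in the question, index 1 is the first Dense layer; a quick check of the kernel and bias shapes (TensorFlow 2.x assumed):

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax),
])

kernel, bias = model.layers[1].get_weights()
print(kernel.shape)  # (784, 128): 28*28 flattened inputs x 128 units
print(bias.shape)    # (128,): one bias per unit
```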

Just call model.get_weights() and you will get all of the model's weights and biases.
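Note that model.get_weights() returns one flat list of NumPy arrays, with kernels and biases interleaved in layer order; for the model from the question, a sketch:

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax),
])

# Flat list: [kernel1, bias1, kernel2, bias2]
all_params = model.get_weights()
print(len(all_params))              # 4
print([p.shape for p in all_params])
```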

To get the weights and biases of a Keras Sequential model at every iteration (epoch), you can do it as in the next example:

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# create model
model = Sequential()
model.add(Dense(numHiddenNeurons, activation="tanh", input_dim=4, kernel_initializer="uniform"))
model.add(Dense(1, activation="linear", kernel_initializer="uniform"))

# Compile model
model.compile(loss='mse', optimizer='adam', metrics=['accuracy', 'mse', 'mae', 'mape'])

# Record the full set of weights at the end of every epoch
weightsBiasDict = {}
weightAndBiasCallback = tf.keras.callbacks.LambdaCallback(
    on_epoch_end=lambda epoch, logs: weightsBiasDict.update({epoch: model.get_weights()}))

# Fit the model (the callbacks argument expects a list)
history = model.fit(X1, Y1, epochs=numIterations, batch_size=batch_size,
                    verbose=0, callbacks=[weightAndBiasCallback])

The weights and biases for every epoch are then accessible in the dictionary weightsBiasDict.

If you just need the weights and bias values at the end of training, you can use model.layers[index].get_weights()[0] for weights and model.layers[index].get_weights()[1] for biases, where index is the layer number in your network, starting at zero for the first layer.
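A sketch of pairing kernels and biases per layer at the end of training, skipping weightless layers such as Flatten (the model here is reused from the question; with a trained model the arrays would hold the learned values):

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation=tf.nn.relu),
    keras.layers.Dense(10, activation=tf.nn.softmax),
])

# Collect (kernel shape, bias shape) only for layers that have weights
params = {}
for layer in model.layers:
    arrays = layer.get_weights()
    if arrays:                      # Flatten returns an empty list
        kernel, bias = arrays
        params[layer.name] = (kernel.shape, bias.shape)

print(params)
```

Guarding on the empty list avoids the IndexError from the question entirely, whatever the layer ordering.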
