I have a custom model that implements an RNN, and I am trying to train it using early stopping as a callback. In both the custom layer and the custom model I have implemented the get_config method, following some online examples, saving all the variables declared in the various classes.
EDIT
I changed the code slightly: I no longer have a custom model, but a custom layer that uses another custom layer inside. The problem now is that when I load the model I get:

TypeError: ('Keyword argument not understood:', 'hidden')

I think this is related to how I save the list of custom layers in the outer layer. How can I solve it? If I instead remove "hidden": self.hidden from the get_config() method, the model is saved and loaded without any error, but when I evaluate the two models (the saved one and the one just trained) I get two completely different loss values.
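For context on the TypeError: Keras's default Layer.from_config simply calls cls(**config), so every key returned by get_config must be a keyword that __init__ accepts. The following Keras-free mock (my assumption of what happens inside load_model, but consistent with the error message) reproduces the failure:

```python
class MockLayer:
    """Stand-in for the custom layer; only the config logic is modeled."""

    def __init__(self, units=100, layers=5, scaling=1.0):
        self.units = units
        self.layers = layers
        self.scaling = scaling
        # Stand-ins for the RNN sub-layers built in __init__.
        self.hidden = [f"rnn_{i}" for i in range(layers)]

    def get_config(self):
        # Mirrors the question's get_config: 'hidden' is NOT an __init__ kwarg.
        return {"units": self.units, "layers": self.layers, "hidden": self.hidden}

    @classmethod
    def from_config(cls, config):
        # Keras's default from_config does exactly this.
        return cls(**config)


layer = MockLayer(units=512, layers=1)
try:
    MockLayer.from_config(layer.get_config())
except TypeError as e:
    print("reload failed:", e)
```

The exact message differs (plain Python complains about an unexpected keyword argument, while Keras's base Layer raises "Keyword argument not understood"), but the mechanism is the same: get_config returned a key that the constructor cannot consume.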
CustomModel
class MyCell(keras.layers.Layer):
    def __init__(self, units, scaling=1., **kwargs):
        self.units = units
        self.scaling = scaling
        # other attributes
        super().__init__(**kwargs)

    def build(self, input_shape):
        # build input weight matrix and other weights
        self.built = True

    def call(self, inputs):
        # computes the output of the cell
        # ...
        return output

    def get_config(self):
        base_config = super().get_config()
        return {**base_config,
                "units": self.units,
                "scaling": self.scaling}
class MyLayer(keras.layers.Layer):
    def __init__(self, units=100, layers=5, scaling=1., **kwargs):
        super().__init__(**kwargs)
        self.layers = layers
        self.units = units
        self.hidden = []
        for _ in range(layers):
            self.hidden.append(keras.layers.RNN(MyCell(units=units, scaling=scaling)))

    def call(self, inputs):
        # compute the output of the deep net
        # ...
        return output

    def get_config(self):
        base_config = super().get_config()
        return {**base_config,
                "layers": self.layers,
                "units": self.units,
                # here I save the list of custom cells
                "hidden": self.hidden}
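For what it's worth, a get_config that returns only constructor arguments, and leaves hidden out entirely (the sub-layers are rebuilt in __init__ from those arguments anyway), round-trips cleanly. Note also that the get_config above does not save scaling, so a reloaded layer would silently fall back to the default value, which by itself can change behavior. A sketch of this idea, again without Keras, assuming __init__ also accepts scaling:

```python
class SketchLayer:
    """Config round-trip sketch: serialize only constructor arguments."""

    def __init__(self, units=100, layers=5, scaling=1.0):
        self.units = units
        self.layers = layers
        self.scaling = scaling
        # Sub-layers are recreated here from the saved arguments,
        # so they never need to appear in the config themselves.
        self.hidden = [("rnn", units, scaling) for _ in range(layers)]

    def get_config(self):
        # Every key below is a keyword that __init__ accepts.
        return {"units": self.units, "layers": self.layers, "scaling": self.scaling}

    @classmethod
    def from_config(cls, config):
        return cls(**config)


original = SketchLayer(units=512, layers=1, scaling=0.5)
restored = SketchLayer.from_config(original.get_config())
print(restored.units, restored.layers, restored.scaling)  # 512 1 0.5
```

This is only a sketch of the config contract, not a drop-in fix; in the real layer the restored weights still come from the saved file, while the config is only responsible for rebuilding the architecture.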
How I train the model and reload it later
model = keras.Sequential(
    [
        layers.Input(shape=(x_train.shape[1], x_train.shape[2])),
        MyLayer(units=512, layers=1),
        tf.keras.layers.Dense(1),
    ]
)
model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001), loss="mse")

callbacks = [
    keras.callbacks.ModelCheckpoint(
        "models/best_custom_model.h5", save_best_only=True, monitor="val_loss"
    ),
    keras.callbacks.EarlyStopping(monitor="val_loss", patience=5, mode="min"),
]

history = model.fit(
    x_train,
    y_train,
    epochs=50,
    batch_size=128,
    validation_split=0.1,
    callbacks=callbacks,
)

# here comes the problem
model1 = tf.keras.models.load_model('models/best_custom_model.h5',
                                    custom_objects={"MyLayer": MyLayer})
model.evaluate(x_test, y_test)   # model just trained: loss 0.10, more or less
model1.evaluate(x_test, y_test)  # loaded model: loss 2.0, more or less