
keras combine pretrained model

I trained a single model and want to combine it with another Keras model using the functional API (the backend is TensorFlow 1.4).

My first model looks like this:

import tensorflow.contrib.keras.api.keras as keras
from tensorflow.contrib.keras.api.keras.layers import Input, Dense

input = Input(shape=(200,))
dnn = Dense(400, activation="relu")(input)
dnn = Dense(400, activation="relu")(dnn)
output = Dense(5, activation="softmax")(dnn)
model = keras.models.Model(inputs=input, outputs=output)

After training this model I save it using the Keras model.save() method. I can also load the model and retrain it without problems.
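
A minimal sketch of that save/load round trip (the file name old_model.h5 and the training arrays x_train/y_train are placeholders):

# train, save, and later reload the first model
model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=10, batch_size=32)
model.save("old_model.h5")

# later, e.g. in another script:
old_model = keras.models.load_model("old_model.h5")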

Now I want to use the output of this model as additional input for a second model:

from tensorflow.contrib.keras.api.keras.layers import Input, Dense, concatenate

# load first model
old_model = keras.models.load_model(path_to_old_model)

input_1 = Input(shape=(200,))
input_2 = Input(shape=(200,))
output_old_model = old_model(input_2)

merge_layer = concatenate([input_1, output_old_model])
dnn_layer = Dense(200, activation="relu")(merge_layer)
dnn_layer = Dense(200, activation="relu")(dnn_layer)
output = Dense(10, activation="sigmoid")(dnn_layer)
new_model = keras.models.Model(inputs=[input_1, input_2], outputs=output)
new_model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
new_model.fit([x1, x2], labels, epochs=50, batch_size=32)

When I try this, I get the following error message:

FailedPreconditionError (see above for traceback): Attempting to use uninitialized value dense_1/kernel
 [[Node: dense_1/kernel/read = Identity[T=DT_FLOAT, _class=["loc:@dense_1/kernel"], _device="/job:localhost/replica:0/task:0/device:GPU:0"](dense_1/kernel)]]
 [[Node: model_1_1/dense_3/BiasAdd/_79 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device_incarnation=1, tensor_name="edge_68_model_1_1/dense_3/BiasAdd", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]

I would do this in the following steps:

  1. Define a function for building a clean model with the same architecture:

     def build_base():
         input = Input(shape=(200,))
         dnn = Dense(400, activation="relu")(input)
         dnn = Dense(400, activation="relu")(dnn)
         output = Dense(5, activation="softmax")(dnn)
         model = keras.models.Model(inputs=input, outputs=output)
         return input, output, model
  2. Build two copies of the same model:

     input_1, output_1, model_1 = build_base()
     input_2, output_2, model_2 = build_base()
  3. Set weights in both models:

     model_1.set_weights(old_model.get_weights())
     model_2.set_weights(old_model.get_weights())
  4. Now do the rest (a combined compile-and-train sketch follows after these steps):

     merge_layer = concatenate([input_1, output_2])
     dnn_layer = Dense(200, activation="relu")(merge_layer)
     dnn_layer = Dense(200, activation="relu")(dnn_layer)
     output = Dense(10, activation="sigmoid")(dnn_layer)
     new_model = keras.models.Model(inputs=[input_1, input_2], outputs=output)
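
Then, with x1, x2 and labels standing in for your actual training arrays, you can compile and train the merged model as usual:

# compile and train the merged model; x1, x2 and labels are placeholder arrays
new_model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
new_model.fit([x1, x2], labels, epochs=50, batch_size=32)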

Let's say you have a pre-trained/saved CNN model called pretrained_model and you want to add densely connected layers to it. Using the functional API, you can write something like this:

from keras import models, layers

# stack new densely connected layers on top of the pretrained model's output
kmodel = layers.Flatten()(pretrained_model.output)
kmodel = layers.Dense(256, activation='relu')(kmodel)
kmodel_out = layers.Dense(1, activation='sigmoid')(kmodel)

# the combined model runs from the pretrained model's input to the new output
model = models.Model(pretrained_model.input, kmodel_out)
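
Assuming a binary target for the single sigmoid output (x_train and y_train are placeholders), you could then compile and train it, optionally freezing the pretrained layers first so that only the new Dense layers are updated:

# optionally freeze the pretrained layers so only the new Dense layers are trained
for layer in pretrained_model.layers:
    layer.trainable = False

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=10, batch_size=32)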
