
Graph disconnected error when loading a model in Keras

I have a model that builds and fits correctly. But if I save the model after training and then try to load it, it throws this error:

ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(None, 256, 256, 3), dtype=tf.float32, name='InputLucaSchifoso'), name='InputLucaSchifoso', description="created by layer 'InputLucaSchifoso'") at layer "conv2d_5LucaSchifoso". The following previous layers were accessed without issue: []

This is the creation and training of the model, which runs without errors:

import tensorflow as tf
tfk = tf.keras
tfkl = tf.keras.layers

# These models are loaded previously
model_dict = {
    "InceptionV3": model_InceptionV3,
    "LucaSchifoso": model_LucaSchifoso,
    "MobileNetV2": model_MobileNetV2, 
    "Resnet50": model_Resnet50
}

# Every layer's name must be unique
for model_name in model_dict.keys():
    for layer in model_dict[model_name].layers:
        layer._name += model_name

# Create Model
proc_layer_dict = {}

input_layer = tfk.layers.Input(shape=input_shape, name="input_layer")

layers_dict = {}

for model_name in preprocessing_function_dict:
    proc_layer_dict[model_name] = tfk.layers.Lambda(
        preprocessing_function_dict[model_name], name="lambda_" + model_name
    )(input_layer)
    
    layers_dict[model_name] = []
    layers_dict[model_name].append(proc_layer_dict[model_name])
    for layer in model_dict[model_name].layers:
        layers_dict[model_name].append(layer(layers_dict[model_name][-1]))

maxpool_LucaSchifoso1 = tfkl.MaxPooling2D(
    name='maxpool_LucaSchifoso1',
    pool_size=(3, 3)
)(layers_dict["LucaSchifoso"][10])

flatten_LucaSchifoso1 = tfkl.Flatten(name='flatten_LucaSchifoso1')(maxpool_LucaSchifoso1)

concatenate_layer = tfkl.Concatenate()([layers_dict["InceptionV3"][2],
                                        layers_dict["MobileNetV2"][2],
                                        flatten_LucaSchifoso1, 
                                        layers_dict["Resnet50"][2]])

dropout_mergione1 = tfkl.Dropout(0.3, name='dropout_mergione1', seed=seed)(concatenate_layer)
dense_mergione1 = tfkl.Dense(units=512, name='dense_mergione1', kernel_initializer=tfk.initializers.GlorotUniform(seed), activation='relu')(dropout_mergione1)
dropout_mergione2 = tfkl.Dropout(0.3, name='dropout_mergione2', seed=seed)(dense_mergione1)
output_mergione = tfkl.Dense(name='output_mergione', units=14, activation='softmax', kernel_initializer=tfk.initializers.GlorotUniform(seed))(dropout_mergione2)

modellone = tfk.Model(inputs=input_layer, outputs=output_mergione, name='model')

modellone.compile(loss=tfk.losses.CategoricalCrossentropy(), optimizer=tfk.optimizers.Adam(), metrics='loss')

# Fit the Model
history = modellone.fit(
        x = train_gen,
        epochs = epochs,
        validation_data = valid_gen,
    ).history

# Save trained model
modellone.save("best")

The four loaded models save and load correctly when they are standalone, so I don't think the problem is there.
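For reference, a standalone round trip along these lines (the save path here is hypothetical) works for each of the four models without raising the error:

import tensorflow as tf

# Hypothetical standalone round trip for one of the sub-models; each of the
# four models reloads on its own without the Graph disconnected error.
model_InceptionV3.save("inceptionv3_standalone")
reloaded = tf.keras.models.load_model("inceptionv3_standalone")
reloaded.summary()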

This is the row that throws the error:

# Load model
model = tf.keras.models.load_model('best')

This is the result of tfk.utils.plot_model(modellone): [model plot image]

I apologize if the code is not enough to test the problem, but I don't know how to make it reproducible without adding all the code. I hope you can help me anyway.

The problem was caused by the input layers of the loaded sub-models ending up inside the new model. For some reason they cause no problems during compilation and training, but they do during loading.
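A minimal sketch of one way to avoid this, assuming the sub-models are purely sequential (which the original wiring loop already assumes), is to skip each sub-model's own InputLayer when re-wiring its layers onto the shared input, so the rebuilt graph has exactly one Input. This would replace the wiring loop shown above:

import tensorflow as tf
tfk = tf.keras

for model_name in preprocessing_function_dict:
    proc_layer_dict[model_name] = tfk.layers.Lambda(
        preprocessing_function_dict[model_name], name="lambda_" + model_name
    )(input_layer)

    layers_dict[model_name] = [proc_layer_dict[model_name]]
    for layer in model_dict[model_name].layers:
        # InputLayer carries no weights; calling it inside the new graph leaves a
        # dangling input tensor that breaks load_model later, so skip it.
        if isinstance(layer, tfk.layers.InputLayer):
            continue
        layers_dict[model_name].append(layer(layers_dict[model_name][-1]))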
