
How do I remove the last layer from a pre-trained model? I have tried `model.layers.pop()`, but it is not working.

I am trying to remove the last layer so that I can use transfer learning.

import keras
from keras.models import Sequential
from keras.layers import Dense

vgg16_model = keras.applications.vgg16.VGG16()
model = Sequential()

for layer in vgg16_model.layers:
    model.add(layer)

model.layers.pop()


# Freeze the layers 
for layer in model.layers:
    layer.trainable = False


# Add 'softmax' instead of earlier 'prediction' layer.
model.add(Dense(2, activation='softmax'))


# Check the summary, and yes new layer has been added. 
model.summary()

But the output is not what I expected: the summary still shows the last layer of the VGG16 model.

Here is the output

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
block1_conv1 (Conv2D)        (None, 224, 224, 64)      1792      
_________________________________________________________________
block1_conv2 (Conv2D)        (None, 224, 224, 64)      36928       

**THE HIDDEN LAYERS** 
_________________________________________________________________
fc1 (Dense)                  (None, 4096)              102764544 
_________________________________________________________________
fc2 (Dense)                  (None, 4096)              16781312  
_________________________________________________________________
predictions (Dense)          (None, 1000)              4097000   
_________________________________________________________________
dense_10 (Dense)             (None, 2)                 2002      
=================================================================
Total params: 138,359,546
Trainable params: 2,002
Non-trainable params: 138,357,544

Note - In the output I have not shown the whole model, just the first few and last layers.

How should I remove the last layer so I can do transfer learning?

PS Keras version = 2.2.4

Just do not add the last layer to your model in the first place; that way you won't even need `pop()`:

import keras
from keras.models import Sequential
from keras.layers import Dense

vgg16_model = keras.applications.vgg16.VGG16()
model = Sequential()

for layer in vgg16_model.layers[:-1]: # this is where I changed your code
    model.add(layer)    

# Freeze the layers 
for layer in model.layers:
    layer.trainable = False

# Add 'softmax' instead of earlier 'prediction' layer.
model.add(Dense(2, activation='softmax'))
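The key change is the `[:-1]` slice, which is ordinary Python list slicing: it yields every layer except the last, so the 1000-way `predictions` layer is never added to the new model. A minimal plain-Python illustration, using the layer names from the VGG16 summary in the question:

```python
# Layer names taken from the VGG16 summary shown in the question
layer_names = ["block1_conv1", "block1_conv2", "fc1", "fc2", "predictions"]

# [:-1] keeps everything except the final element
kept = layer_names[:-1]
print(kept)  # ['block1_conv1', 'block1_conv2', 'fc1', 'fc2']

# 'predictions' is never in the slice, so it is never added to the model
print("predictions" in kept)  # False
```

This avoids the problem with `model.layers.pop()`, which only mutates the Python list of layers and does not rebuild the model's computation graph.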

As an alternative to markuscosinus' answer, you can take the output just before the prediction layer and pass it through your own prediction head. You can do it as follows:

from keras.models import Model
from keras.layers import Dense, Dropout, Flatten

n_classes = 2  # number of target classes (2 in the question)

for layer in vgg16_model.layers:
    layer.trainable = False
last_layer = vgg16_model.get_layer('fc2').output
out = Flatten()(last_layer)
out = Dense(128, activation='relu', name='fc3')(out)
out = Dropout(0.5)(out)
out = Dense(n_classes, activation='softmax', name='prediction')(out)
vgg16_custom_model = Model(inputs=vgg16_model.input, outputs=out)

I suggest adding a Flatten and another Dense layer before your softmax, because the last layer 'fc2' has 4096 nodes and it is a big jump to go straight down to 2.
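For a rough sense of scale, the two head designs can be compared by trainable-parameter count using the standard dense-layer formula (in_units × out_units weights plus out_units biases). Note the staged head is actually larger; its benefit is the extra nonlinearity between the 4096 features and the 2 outputs, not parameter savings:

```python
# Trainable parameters of a Dense layer: weights plus one bias per output
def dense_params(n_in, n_out):
    return n_in * n_out + n_out

# Head A: fc2 (4096) -> softmax(2) directly
direct = dense_params(4096, 2)
print(direct)  # 8194

# Head B: fc2 (4096) -> fc3 (128, relu) -> softmax(2), as suggested above
staged = dense_params(4096, 128) + dense_params(128, 2)
print(staged)  # 524674

# Sanity check against the question's summary: its dense_10 layer sits on
# top of the 1000-unit 'predictions' layer, giving 1000*2 + 2 = 2002 params
print(dense_params(1000, 2))  # 2002
```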

And of course, Dropout before the prediction layer will give you better results.
