
keras setting trainable flag on pretrained model

Suppose I have a model:

from tensorflow.keras.applications import DenseNet201
from tensorflow.keras.layers import Input, Dense, BatchNormalization, ReLU
from tensorflow.keras.models import Sequential

base_model = DenseNet201(input_tensor=Input(shape=basic_shape))

model = Sequential()
model.add(base_model)

model.add(Dense(400))
model.add(BatchNormalization())
model.add(ReLU())

model.add(Dense(50, activation='softmax'))

model.save('test.hdf5')

Then I load the saved model and try to make the last 40 layers of DenseNet201 trainable and the first 161 non-trainable:

from tensorflow.keras.models import load_model

saved_model = load_model('test.hdf5')
cnt = 44
saved_model.trainable = False
while cnt > 0:
    saved_model.layers[-cnt].trainable = True
    cnt -= 1

But this does not actually work, because the whole DenseNet201 model is treated as a single layer, and I just get an index out of range error:

Layer (type)                 Output Shape              Param #   
=================================================================
densenet201 (Functional)     (None, 1000)              20242984  
_________________________________________________________________
dense (Dense)                (None, 400)               400400    
_________________________________________________________________
batch_normalization (BatchNo (None, 400)               1600      
_________________________________________________________________
re_lu (ReLU)                 (None, 400)               0         
_________________________________________________________________
dense_1 (Dense)              (None, 50)                20050     
=================================================================
Total params: 20,665,034
Trainable params: 4,490,090
Non-trainable params: 16,174,944

The question is: how can I actually make the first 161 layers of DenseNet non-trainable and the last 40 layers trainable on a loaded model?

densenet201 (Functional) is a nested model, therefore you can access its layers the same way you access the layers of your 'topmost' model:

saved_model.layers[0].layers

where saved_model.layers[0] is a model with its own layers.
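
As a quick sanity check (a minimal sketch, assuming the test.hdf5 file saved above), comparing the layer counts of the outer model and of the nested one makes the structure obvious:

from tensorflow.keras.models import load_model

saved_model = load_model('test.hdf5')

# The outer model has only 5 layers; the whole DenseNet201 is the first one.
print(len(saved_model.layers))            # 5
print(saved_model.layers[0].name)         # densenet201

# The nested model has its own `layers` list with several hundred entries;
# that is the list you need to index into.
print(len(saved_model.layers[0].layers))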

In your loop, you need to access the layers like this:

saved_model.layers[0].layers[-cnt].trainable = True

Update

By default, the loaded model's layers are trainable (trainable=True), so you will need to set the bottom layers' trainable attribute to False instead.
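
Putting it together, a minimal sketch of the load-and-freeze step could look like the following (the optimizer, loss, and metrics are placeholders, and cnt = 44 is taken from the question):

from tensorflow.keras.models import load_model

saved_model = load_model('test.hdf5')
base = saved_model.layers[0]   # the nested densenet201 model

cnt = 44   # number of trailing layers to keep trainable, as in the question
# Everything is trainable after loading, so freeze only the bottom layers
# of the nested model and leave the last `cnt` layers as they are.
for layer in base.layers[:-cnt]:
    layer.trainable = False

# Recompile so the updated trainable flags take effect before further training.
saved_model.compile(optimizer='adam',
                    loss='categorical_crossentropy',
                    metrics=['accuracy'])

Freezing the bottom layers directly, rather than setting trainable=False on the whole nested model and then re-enabling its top layers, avoids the case where the frozen container reports no trainable weights at all regardless of the flags on its inner layers.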
