How to insert dropout layers after activation layers in a pre-trained non-sequential model using the functional Keras API?
I am working on a modified ResNet and want to insert dropout after the activation layers. I tried the following, but because the model is not sequential, it did not work:
def add_dropouts(model, probability=0.5):
    print("Adding Dropouts")
    updated_model = tf.keras.models.Sequential()
    for layer in model.layers:
        print("layer = ", layer)
        updated_model.add(layer)
        if isinstance(layer, tf.keras.layers.Activation):
            updated_model.add(tf.keras.layers.Dropout(probability))
    print("updated model summary = ", updated_model.summary())
    print("model summary = ", model.summary())
    model = updated_model
    return model
base_model = tf.keras.applications.ResNet50V2(include_top=False,
                                              input_shape=input_img_shape,
                                              pooling='avg')
base_model = add_dropouts(base_model, probability=0.5)
Then I tried my own version using the functional API, but this method doesn't work either: it raises a ValueError saying the Tensor doesn't have an output.
prev_layer = base_model.layers[0]
for layer in base_model.layers:
    next_layer = layer(prev_layer.output)
    if isinstance(layer, tf.keras.layers.Activation):
        next_layer = Dropout(0.5)(next_layer.output)
    prev_layer = next_layer
Does anyone know how to add dropout layers into ResNet or any other pretrained network?
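For reference, the per-layer loop above can be made to work by keeping a map from each original symbolic tensor to its rebuilt counterpart, so that multi-input layers (e.g. the Add layers in ResNet's skip connections) get all of their inputs rewired. A minimal sketch, assuming TF 2.x; `insert_dropout_after_activations` is my own helper name, not a Keras API:

```python
import tensorflow as tf

def insert_dropout_after_activations(model, rate=0.5):
    # Map each original symbolic tensor (keyed by id) to its rebuilt version.
    tensor_map = {id(t): t for t in model.inputs}
    for layer in model.layers:
        if isinstance(layer, tf.keras.layers.InputLayer):
            continue
        # Read .input/.output *before* calling the layer again; afterwards
        # the layer has two inbound nodes and these attributes are ambiguous.
        orig_inputs, orig_output = layer.input, layer.output
        rebuilt_inputs = tf.nest.map_structure(
            lambda t: tensor_map[id(t)], orig_inputs)
        x = layer(rebuilt_inputs)  # re-calling the layer shares its weights
        if isinstance(layer, tf.keras.layers.Activation):
            x = tf.keras.layers.Dropout(rate, name=layer.name + "_dropout")(x)
        # Assumes single-output layers, which holds for all ResNet layers.
        tensor_map[id(orig_output)] = x
    return tf.keras.Model(model.inputs,
                          [tensor_map[id(t)] for t in model.outputs])
```

You could then call this on the pretrained model, e.g. `base_model = insert_dropout_after_activations(base_model, 0.5)`.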
So eventually I figured out how to do it, but it's very hacky. Go to:
C:\ProgramData\Anaconda3\envs\*your env name*\Lib\site-packages\tensorflow\python\keras\applications
Open resnet.py. (This will also change the ResNetV2 models, because they are built on the original resnet module.) Just Ctrl+F for "activation", and wherever you see an activation layer (usually in the form x = Layer(x), building the model one layer at a time), add x = Dropout(prob)(x) right after it. Here is an example:
if not preact:
    x = layers.BatchNormalization(
        axis=bn_axis, epsilon=1.001e-5, name='conv1_bn')(x)
x = layers.Activation('relu', name='conv1_relu')(x)  # insert after each of these
x = layers.Dropout(prob)(x)  # added dropout
Do this for every similar search result for "activation".
Then you will see the dropout layers in your model summary.
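A less invasive alternative to editing the installed resnet.py is `tf.keras.models.clone_model` with a `clone_function` that swaps every Activation for a small Activation-plus-Dropout block. Note that `clone_model` re-initialises weights, so the pretrained weights have to be copied back by layer name afterwards. This is only a sketch, assuming TF 2.x; `activation_with_dropout` and `add_dropout` are my own names:

```python
import tensorflow as tf

def activation_with_dropout(layer, rate=0.5):
    """clone_function: replace each Activation with Activation + Dropout."""
    if isinstance(layer, tf.keras.layers.Activation):
        return tf.keras.Sequential([
            tf.keras.layers.Activation(layer.activation, name=layer.name),
            tf.keras.layers.Dropout(rate),
        ], name=layer.name + "_block")
    # Default clone behaviour for every other layer type.
    return layer.__class__.from_config(layer.get_config())

def add_dropout(model, rate=0.5):
    clone = tf.keras.models.clone_model(
        model, clone_function=lambda l: activation_with_dropout(l, rate))
    # clone_model re-initialises weights; copy the pretrained ones back
    # by name (only weight-bearing layers, whose names are preserved).
    for layer in model.layers:
        if layer.get_weights():
            clone.get_layer(layer.name).set_weights(layer.get_weights())
    return clone
```

Applied to the pretrained network, this would look like `base_model = add_dropout(tf.keras.applications.ResNet50V2(include_top=False, pooling='avg'), 0.5)`, without touching any files in site-packages.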