
Extracting features from EfficientNet Tensorflow

I have a CNN model trained using EfficientNetB6. My task is to extract the features of this trained model by removing the last dense layer and then using those weights to train a boosting model. I did this earlier using PyTorch and was able to extract the outputs from the layers I was interested in, predict on my validation set, and then boost.

I am doing this now in TensorFlow but am currently stuck. Below is my model structure, and I have tried using the code on the website but did not have any luck. [model structure image]

I want to remove the last dense layer and predict on the validation set using the remaining layers.

I tried using:

layer_name = 'efficientnet-b6'
intermediate_layer_model = tf.keras.Model(inputs=model.input, outputs=model.get_layer(layer_name).output)

but I get the error: "ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_1:0", shape=(None, 760, 760, 3), dtype=float32) at layer "input_1". The following previous layers were accessed without issue: []"

Any way to resolve this?
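(For context: this error typically occurs because model.get_layer('efficientnet-b6') returns a nested tf.keras.Model, and its .output tensor belongs to the backbone's own internal graph, so Keras cannot trace a path from the outer model.input to it. A minimal sketch of one common workaround, not from the original post and assuming a validation array x_val of shape (n, 760, 760, 3), is to call the nested backbone directly, since it shares the trained weights:)

backbone = model.get_layer('efficientnet-b6')   # nested sub-model with its own input/output
features = backbone.predict(x_val)              # x_val is an assumed validation image array
pooled = features.mean(axis=(1, 2))             # global average pooling in NumPy -> (n, channels)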

Sorry, my bad. I simply added a GlobalAveragePooling2D layer after the EfficientNet layer and I am able to extract the weights and continue :)

Just for reference:

def build_model(dim=CFG['net_size'], ef=0):
    # Single image input of shape (dim, dim, 3)
    inp = tf.keras.layers.Input(shape=(dim, dim, 3))
    # EFNS[ef] is the EfficientNet constructor (e.g. EfficientNetB6); include_top=False drops the classifier head
    base = EFNS[ef](input_shape=(dim, dim, 3), weights='imagenet', include_top=False)
    x = base(inp)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)          # pooled feature vector
    x = tf.keras.layers.Dense(1, activation='sigmoid')(x)    # binary classification head
    model = tf.keras.Model(inputs=inp, outputs=x)
    opt = tf.keras.optimizers.Adam(learning_rate=0.001)
    loss = tf.keras.losses.BinaryCrossentropy(label_smoothing=0.05)
    model.compile(optimizer=opt, loss=loss, metrics=[tf.keras.metrics.AUC(name='auc')])
    print(model.summary())
    return model
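As a follow-up sketch (not part of the original post): with this model, the features can be taken from the GlobalAveragePooling2D output of the outer graph and fed to a boosting model. Here model, x_train, y_train, and x_val are assumed to exist, and XGBoost stands in for whichever boosting library is actually used:

import xgboost as xgb   # assumption: any gradient-boosting library would work here

# Cut the trained model just before the final Dense layer; the GlobalAveragePooling2D
# output is the per-image feature vector (2304-dimensional for EfficientNet-B6).
feature_extractor = tf.keras.Model(inputs=model.input, outputs=model.layers[-2].output)

train_features = feature_extractor.predict(x_train)
val_features = feature_extractor.predict(x_val)

booster = xgb.XGBClassifier()
booster.fit(train_features, y_train)
val_preds = booster.predict_proba(val_features)[:, 1]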
