
How to combine features extracted from two CNN models?

I have two CNN models, both following the same architecture. I trained cnn1 on 'train set 1' and cnn2 on 'train set 2'. Then I extracted the features using the code below.

#cnn1

    model.pop()  # removes softmax layer
    model.pop()  # removes dropout layer
    model.pop()  # removes activation layer
    model.pop()  # removes batch-norm layer
    model.build()  # last layer is now the Dense(512)
    features1 = model.predict(train_set_1)
    print(features1.shape)  # (600, 512)

#cnn2

    model.pop()  # removes softmax layer
    model.pop()  # removes dropout layer
    model.pop()  # removes activation layer
    model.pop()  # removes batch-norm layer
    model.build()  # last layer is now the Dense(512)
    features2 = model.predict(train_set_2)
    print(features2.shape)  # (600, 512)

How do I combine features1 and features2 so that the output shape is (600, 1024)?

Simplest solution:

You can simply concatenate the outputs of the two networks like this:

features = np.concatenate([features1, features2], axis=1)
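As a quick shape check, here is a minimal sketch with dummy arrays standing in for the real feature matrices (the (600, 512) shapes come from the question):

import numpy as np

# dummy stand-ins for the real feature matrices: 600 samples, 512 features each
features1 = np.random.rand(600, 512)
features2 = np.random.rand(600, 512)

# concatenate along the feature axis
features = np.concatenate([features1, features2], axis=1)
print(features.shape)  # (600, 1024)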

Alternative:

Given two trained models that share the same structure, whatever that structure is, you can combine them like this:

# minimal imports for a standalone example
import numpy as np
from tensorflow.keras.layers import Input, Dense, Dropout, BatchNormalization, Concatenate
from tensorflow.keras.models import Model

# generate dummy data
n_sample = 600
set1 = np.random.uniform(0, 1, (n_sample, 30))
set2 = np.random.uniform(0, 1, (n_sample, 30))

# model 1
inp1 = Input((30,))
x1 = Dense(512)(inp1)
x1 = Dropout(0.3)(x1)
x1 = BatchNormalization()(x1)
out1 = Dense(3, activation='softmax')(x1)
m1 = Model(inp1, out1)
# m1.fit(...)

# model 2
inp2 = Input((30,))
x2 = Dense(512)(inp2)
x2 = Dropout(0.3)(x2)
x2 = BatchNormalization()(x2)
out2 = Dense(3, activation='softmax')(x2)
m2 = Model(inp2, out2)
# m2.fit(...)

# concatenate the desired output
concat = Concatenate()([m1.layers[1].output, m2.layers[1].output]) # get the outputs of dense 512 layers
merge = Model([m1.input, m2.input], concat)

# make combined predictions
merge.predict([set1,set2]).shape  # (n_sample, 1024)
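Note that merge simply reuses the already trained weights of m1 and m2 as a feature extractor, so it can be called with predict without any further training. If the real models have more layers than this toy example, picking the Dense(512) layer by position (m1.layers[1]) can be fragile; a minimal sketch, assuming you are free to give the feature layers explicit names ('feat1' and 'feat2' are not in the original code), retrieves them by name instead:

# same toy models, but the 512-unit feature layers are named explicitly
inp1 = Input((30,))
x1 = Dense(512, name='feat1')(inp1)
x1 = Dropout(0.3)(x1)
x1 = BatchNormalization()(x1)
out1 = Dense(3, activation='softmax')(x1)
m1 = Model(inp1, out1)

inp2 = Input((30,))
x2 = Dense(512, name='feat2')(inp2)
x2 = Dropout(0.3)(x2)
x2 = BatchNormalization()(x2)
out2 = Dense(3, activation='softmax')(x2)
m2 = Model(inp2, out2)

# fetch the feature layers by name rather than by index
concat = Concatenate()([m1.get_layer('feat1').output, m2.get_layer('feat2').output])
merge = Model([m1.input, m2.input], concat)
merge.predict([set1, set2]).shape  # (n_sample, 1024)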
