Access an intermediate layer in Keras after multi_gpu_model
As stated in this issue, Keras' multi_gpu_model alters the apparent layer structure of a model. For instance, when the layers of the original model are:
>>> [l.name for l in my_model.layers]
['input', 'conv2d_1', 'conv2d_2', 'maxpool_1', 'conv2d_3', 'maxpool_2', 'dense_1', 'dense_2', 'dense_3']
After using multi_gpu_model(my_model), the layers become:
>>> new_model = multi_gpu_model(my_model)
>>> [l.name for l in new_model.layers]
['input', 'lambda_1', 'lambda_2', 'lambda_3', 'lambda_4', 'model_1', 'dense_3']
I am trying to access the output of the maxpool_2 layer from the original model, in order to use it in a different, subsequent model. How can I access the original maxpool_2 layer's output after using multi_gpu_model?
Note 1: since old_model has not been trained, I have no interest in getting the output from old_model.get_layer('maxpool_2'). Only the outputs from the trained new_model are of interest here.
Note 2: calling new_model.get_layer('maxpool_2') triggers ValueError: No such layer: maxpool_2.
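One way to see why that lookup fails: after wrapping, the original layers live inside a nested sub-model (shown as model_1 above), not at the top level. A minimal sketch of the same nesting, using a small stand-in model (the layer names here are hypothetical, and tensorflow.keras stands in for standalone Keras):

```python
from tensorflow import keras

# Build a tiny stand-in for the original model.
inp = keras.Input(shape=(4,), name='input')
hidden = keras.layers.Dense(8, name='hidden_1')(inp)
out = keras.layers.Dense(2, name='dense_out')(hidden)
old_model = keras.Model(inp, out, name='model_1')

# Nest it inside an outer model, the same way multi_gpu_model
# wraps the original model as a single layer.
outer_in = keras.Input(shape=(4,), name='outer_input')
new_model = keras.Model(outer_in, old_model(outer_in))

# The inner layer is invisible at the top level ...
try:
    new_model.get_layer('hidden_1')
except ValueError:
    print('No such layer at top level')

# ... but reachable through the nested sub-model,
# and it is the very same layer object as in old_model.
layer = new_model.get_layer('model_1').get_layer('hidden_1')
print(layer is old_model.get_layer('hidden_1'))  # True
```

So new_model.get_layer('model_1').get_layer('maxpool_2') would be an alternative route to the same layer, since the wrapper holds the original model itself.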
From the Keras source code it appears that the original model shares its weights with the multi-GPU model.
Therefore it suffices to do the following to access the intermediate layer:
old_model.get_layer('maxpool_2').output
This will use the output of the trained model, since the weights are shared.
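Putting it together, a feature-extractor sub-model can be cut at that output; the layer names and shapes below are stand-ins for the ones in the question, and tensorflow.keras stands in for standalone Keras:

```python
import numpy as np
from tensorflow import keras

# Stand-in for the original model, with a named pooling layer.
inp = keras.Input(shape=(16, 16, 1), name='input')
x = keras.layers.Conv2D(4, 3, padding='same', name='conv2d_1')(inp)
x = keras.layers.MaxPooling2D(name='maxpool_2')(x)
x = keras.layers.Flatten()(x)
out = keras.layers.Dense(3, name='dense_3')(x)
old_model = keras.Model(inp, out)

# Because the multi-GPU wrapper shares its weights with old_model,
# a sub-model cut at maxpool_2 reuses the trained weights directly.
feature_model = keras.Model(inputs=old_model.input,
                            outputs=old_model.get_layer('maxpool_2').output)

features = feature_model.predict(np.zeros((1, 16, 16, 1)))
print(features.shape)  # (1, 8, 8, 4)
```

The sub-model can then feed a different, subsequent model, as the question asks.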