
Feed input to intermediate layer of tensorflow.keras model

I am trying to implement a hydranet architecture using tensorflow.keras.applications EfficientNetB0. The goal of this architecture is to split the network into two parts (first part: backbone, second part: head). An input image should then only be fed once to the backbone and its output should be stored. Afterwards this output should be fed directly to the heads (there can be more than one, depending on the number of classes to classify). Optimal approach:

  1. I don't want to rebuild the entire model for every head.
  2. The backbone should only be executed once.

I already checked this forum post: keras-give-input-to-intermediate-layer-and-get-final-output, but the presented solutions either require re-coding the head or don't work.

I've tried the following:

from tensorflow import keras
from tensorflow.keras import backend as K
from tensorflow.keras.applications import EfficientNetB0 as Net

split_idx = 73
input_shape = (250, 250, 3)  # depth=3 because ImageNet models expect RGB input
model = Net(weights="imagenet", include_top=True)

# Approach 1:

# backbone: model input -> output of the layer at split_idx
model_backbone = keras.models.Model(inputs=model.input, outputs=model.layers[split_idx].output)

# create new model taking the output from backbone as input and creating final output of head
model_head = keras.models.Model(inputs=model.layers[split_idx].output, 
                                outputs=model.layers[-1].output)

# Approach 2:
# create function for feeding input through backbone
# the function takes a normal input image as input and returns the output of the final backbone layer
create_backbone_output = K.function([model.layers[0].input], model.layers[split_idx].output)

# create function for feeding output of backbone through heads
create_heads_output = K.function([model.layers[split_idx].output], 
                                  model.output)

But when I try to execute this, I get a "graph disconnected" error for both approaches:

WARNING:tensorflow:Functional model inputs must come from `tf.keras.Input` (thus holding past layer 
metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input 
to "model_5" was not an Input tensor, it was generated by layer block3b_drop.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: block3b_drop/Identity:0
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-33-64dd6f6430a1> in <module>
  6 # create function for feeding output of backbone through heads
  7 create_heads_output = K.function([model.layers[split_idx].output], 
----> 8                                  model.output)

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\backend.py in 
function(inputs, 
outputs, updates, name, **kwargs)
4067     from tensorflow.python.keras import models  # pylint: disable=g-import-not-at-top
4068     from tensorflow.python.keras.utils import tf_utils  # pylint: disable=g-import-not-at-top
-> 4069     model = models.Model(inputs=inputs, outputs=outputs)
4070 
4071     wrap_outputs = isinstance(outputs, list) and len(outputs) == 1

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\training\tracking\base.py in 
_method_wrapper(self, *args, **kwargs)
515     self._self_setattr_tracking = False  # pylint: disable=protected-access
516     try:
--> 517       result = method(self, *args, **kwargs)
518     finally:
519       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\functional.py in 
__init__(self, inputs, outputs, name, trainable, **kwargs)
118     generic_utils.validate_kwargs(kwargs, {})
119     super(Functional, self).__init__(name=name, trainable=trainable)
--> 120     self._init_graph_network(inputs, outputs)
121 
122   @trackable.no_automatic_dependency_tracking

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\training\tracking\base.py in 
_method_wrapper(self, *args, **kwargs)
515     self._self_setattr_tracking = False  # pylint: disable=protected-access
516     try:
--> 517       result = method(self, *args, **kwargs)
518     finally:
519       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\functional.py in 
_init_graph_network(self, inputs, outputs)
202     # Keep track of the network's nodes and layers.
203     nodes, nodes_by_depth, layers, _ = _map_graph_network(
--> 204         self.inputs, self.outputs)
205     self._network_nodes = nodes
206     self._nodes_by_depth = nodes_by_depth

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\functional.py in 
_map_graph_network(inputs, outputs)
981                              'The following previous layers '
982                              'were accessed without issue: ' +
--> 983                              str(layers_with_complete_input))
984         for x in nest.flatten(node.outputs):
985           computable_tensors.add(id(x))

ValueError: Graph disconnected: cannot obtain value for tensor 
KerasTensor(type_spec=TensorSpec(shape=(None, 224, 224, 3), dtype=tf.float32, name='input_4'), 
name='input_4', description="created by layer 'input_4'") at layer "rescaling_3". The following 
previous layers were accessed without issue: []

I know that the error stems from the fact that the provided tensor is not an input tensor. Is there any solution to that problem?

1.) This model is not as sequential as you are trying to treat it -> the layer at split_idx + 1 is an Add operation that merges the output of another, earlier layer; that second tensor has to be provided to the head as an additional input alongside the backbone output.

block3b_drop (Dropout)          (None, 28, 28, 40)   0           block3b_project_bn[0][0]         
__________________________________________________________________________________________________
block3b_add (Add)               (None, 28, 28, 40)   0           block3b_drop[0][0]               
                                                                 block3a_project_bn[0][0]         
__________________________________________________________________________________________________
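To confirm this programmatically, here is a small check (my sketch, assuming the standard EfficientNetB0 layer names): it lists the tensors consumed by block3b_add, which should be the outputs of block3b_drop (the chosen split point) and block3a_project_bn.

# block3b_add takes two input tensors; a head cut right after block3b_drop is
# therefore left with a dangling second input
add_layer = model.get_layer('block3b_add')
print([t.name for t in add_layer.input])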

2.) Add all the inputs needed for the given outputs:

second_input1 = keras.Input(shape=model.layers[split_idx].output.shape[1:])
second_input2 = keras.Input(shape=model.get_layer(name='block3a_project_bn').output.shape[1:])
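
The backbone then also has to expose both of these tensors, so the stored activations can later be fed to the head without re-running the backbone. A minimal sketch (not part of the original answer, reusing model and split_idx from the question):

# backbone that returns both tensors consumed by block3b_add
model_backbone = keras.models.Model(
    inputs=model.input,
    outputs=[model.layers[split_idx].output,                      # block3b_drop
             model.get_layer(name='block3a_project_bn').output])  # block3a_project_bn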

3.) Rewire the rest of the model. Here you need to add some things yourself, but here are some snippets to get you started:

Rewiring it sequentially would look like this:
    tmp = [second_input1,second_input2]
    for l in range(split_idx+1, len(model.layers)):
        layer = model.layers[l]
        print(layer.name, layer.input)
        tmp = layer(tmp)

In your case this is not enough; you need to find the correct inputs, which the following snippet helps with: find the correct inputs, feed them to the next layer (keeping track of the outputs), and work your way through the graph. A rewiring sketch along these lines follows after the snippet.

for l in model.layers:
    # multiple inputs
    if type(l.input) is list:
        for li,lv in enumerate(l.input):
            print('o ', li, lv.name)
    else:
        print('- ', l.input.name)
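
Putting these pieces together, a sketch of the graph-aware rewiring could look like this (my sketch, not the original answer's code; it assumes every layer in the head is used exactly once, which holds for EfficientNetB0, and that block3a_project_bn is the only tensor crossing the split, as identified above):

# map each original tensor name to its rebuilt counterpart,
# seeded with the two new Input tensors
tensor_map = {
    model.layers[split_idx].output.name: second_input1,
    model.get_layer('block3a_project_bn').output.name: second_input2,
}

for layer in model.layers[split_idx + 1:]:
    original_inputs = layer.input if isinstance(layer.input, list) else [layer.input]
    # rebuild a layer only once all of its inputs exist in the new graph
    if all(t.name in tensor_map for t in original_inputs):
        new_inputs = [tensor_map[t.name] for t in original_inputs]
        tensor_map[layer.output.name] = layer(new_inputs if len(new_inputs) > 1 else new_inputs[0])

# head model: takes the stored backbone activations, reuses the original weights
model_head = keras.models.Model(
    inputs=[second_input1, second_input2],
    outputs=tensor_map[model.layers[-1].output.name])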

Another, cheap way would be: save the model as JSON, add your input node there, and remove the unused nodes. Then load the new JSON file; in that case you do not need to rewire anything by hand.
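
A rough sketch of that JSON route (my addition; the actual config edit is only indicated as a comment, since it depends on where you split):

import json
from tensorflow.keras.models import model_from_json

# round-trip the architecture through JSON (weights are not included)
config = json.loads(model.to_json())

# assumption: edit config['config']['layers'] here -- drop the backbone layers,
# insert an InputLayer with the backbone's output shape, and repoint the head
# layers' inbound_nodes to that new input

model_head = model_from_json(json.dumps(config))

# weights then have to be copied over separately, e.g. layer by layer
# via get_weights() / set_weights() from the original model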
