
Feed input to intermediate layer of tensorflow.keras model

I am trying to implement a hydranet architecture using tensorflow.keras.applications EfficientNetB0. The goal of this architecture is to split the network into two parts (first part: backbone, second part: head). An input image should then be fed to the backbone only once and its output should be stored. Afterwards this output should be fed directly to the heads (there can be more than one, depending on the number of classes to classify). The ideal approach (see the short sketch after the list):

  1. I don't want to rebuild the entire model for every head.
  2. The backbone should only be executed once.
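
In other words, the intended usage would look roughly like this (a rough sketch, with assumed names model_backbone, heads and images):

features = model_backbone(images)                 # backbone runs exactly once per batch
predictions = [head(features) for head in heads]  # each head reuses the stored features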

I have already checked this forum post: keras-give-input-to-intermediate-layer-and-get-final-output, but the presented solutions either require re-coding the head or do not work.

I've tried the following:

from tensorflow import keras
from tensorflow.keras import backend as K
from tensorflow.keras.applications import EfficientNetB0 as Net
from tensorflow.keras.models import Model

split_idx = 73
input_shape = (250, 250, 3)  # depth=3 because ImageNet models are trained on RGB images
model = Net(weights="imagenet", include_top=True)

# Approach 1:

# create the backbone: everything from the original input up to layer split_idx
model_backbone = keras.models.Model(inputs=model.input, outputs=model.layers[split_idx].output)

# create new model taking the output from backbone as input and creating final output of head
model_head = keras.models.Model(inputs=model.layers[split_idx].output, 
                                outputs=model.layers[-1].output)

# Approach 2:
# create function for feeding input through backbone
# the function takes a normal input image as input and returns the output of the final backbone layer
create_backbone_output = K.function([model.layers[0].input], model.layers[split_idx].output)

# create function for feeding output of backbone through heads
create_heads_output = K.function([model.layers[split_idx].output], 
                                  model.output)

But when I try to execute this, I get a "graph disconnected error" for both approaches:

WARNING:tensorflow:Functional model inputs must come from `tf.keras.Input` (thus holding past layer metadata), they cannot be the output of a previous non-Input layer. Here, a tensor specified as input to "model_5" was not an Input tensor, it was generated by layer block3b_drop.
Note that input tensors are instantiated via `tensor = tf.keras.Input(shape)`.
The tensor that caused the issue was: block3b_drop/Identity:0
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-33-64dd6f6430a1> in <module>
  6 # create function for feeding output of backbone through heads
  7 create_heads_output = K.function([model.layers[split_idx].output], 
----> 8                                  model.output)

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\backend.py in function(inputs, outputs, updates, name, **kwargs)
4067     from tensorflow.python.keras import models  # pylint: disable=g-import-not-at-top
4068     from tensorflow.python.keras.utils import tf_utils  # pylint: disable=g-import-not-at-top
-> 4069     model = models.Model(inputs=inputs, outputs=outputs)
4070 
4071     wrap_outputs = isinstance(outputs, list) and len(outputs) == 1

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\training\tracking\base.py in _method_wrapper(self, *args, **kwargs)
515     self._self_setattr_tracking = False  # pylint: disable=protected-access
516     try:
--> 517       result = method(self, *args, **kwargs)
518     finally:
519       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\functional.py in __init__(self, inputs, outputs, name, trainable, **kwargs)
118     generic_utils.validate_kwargs(kwargs, {})
119     super(Functional, self).__init__(name=name, trainable=trainable)
--> 120     self._init_graph_network(inputs, outputs)
121 
122   @trackable.no_automatic_dependency_tracking

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\training\tracking\base.py in _method_wrapper(self, *args, **kwargs)
515     self._self_setattr_tracking = False  # pylint: disable=protected-access
516     try:
--> 517       result = method(self, *args, **kwargs)
518     finally:
519       self._self_setattr_tracking = previous_value  # pylint: disable=protected-access

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\functional.py in _init_graph_network(self, inputs, outputs)
202     # Keep track of the network's nodes and layers.
203     nodes, nodes_by_depth, layers, _ = _map_graph_network(
--> 204         self.inputs, self.outputs)
205     self._network_nodes = nodes
206     self._nodes_by_depth = nodes_by_depth

~\AppData\Roaming\Python\Python36\site-packages\tensorflow\python\keras\engine\functional.py in _map_graph_network(inputs, outputs)
981                              'The following previous layers '
982                              'were accessed without issue: ' +
--> 983                              str(layers_with_complete_input))
984         for x in nest.flatten(node.outputs):
985           computable_tensors.add(id(x))

ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(None, 224, 224, 3), dtype=tf.float32, name='input_4'), name='input_4', description="created by layer 'input_4'") at layer "rescaling_3". The following previous layers were accessed without issue: []

I know the error stems from the fact that the provided tensor is not an Input tensor. Is there any solution to this problem?

1.) This model is not as sequential as you are trying to treat it: the layer at split_idx + 1 is an Add operation that also consumes the output of an earlier layer (block3a_project_bn), so that tensor has to be added to the outputs of the first model (the backbone) and to the inputs of the second model (the head).

block3b_drop (Dropout)          (None, 28, 28, 40)   0           block3b_project_bn[0][0]         
__________________________________________________________________________________________________
block3b_add (Add)               (None, 28, 28, 40)   0           block3b_drop[0][0]               
                                                                 block3a_project_bn[0][0]         
__________________________________________________________________________________________________
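
For example, a minimal sketch of the backbone side (assuming the same model and split_idx as above, and the layer names shown in the summary), exposing both tensors that the Add layer needs:

# backbone with two outputs: block3b_drop (= layers[split_idx]) and block3a_project_bn
model_backbone = keras.models.Model(
    inputs=model.input,
    outputs=[model.layers[split_idx].output,
             model.get_layer(name='block3a_project_bn').output])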

2.) Add an Input for every tensor the head needs, using the shapes of the corresponding backbone outputs:

second_input1 = keras.Input(shape=model.layers[split_idx].output.shape[1:])
second_input2 = keras.Input(shape=model.get_layer(name='block3a_project_bn').output.shape[1:])

3.) Rewire the rest of the model. Here you need to add a few things yourself, but the following snippets should get you started.

Rewiring it purely sequentially would look like this:
    tmp = [second_input1,second_input2]
    for l in range(split_idx+1, len(model.layers)):
        layer = model.layers[l]
        print(layer.name, layer.input)
        tmp = layer(tmp)

In your case this is not enough: you need to find the correct inputs for each layer. The following snippet prints them. Find the correct inputs, rewire them to the new outputs (keeping track of the outputs as you go), and work your way through the graph:

for l in model.layers:
    # multiple inputs
    if type(l.input) is list:
        for li,lv in enumerate(l.input):
            print('o ', li, lv.name)
    else:
        print('- ', l.input.name)
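
Putting that together, a rough sketch of such a rewiring loop (assuming every layer after the split is called exactly once in the original graph, and that all of its inputs come either from the two new Inputs or from layers that were already rewired; a KeyError means another Input from before the split is needed):

# map the original symbolic tensors (by id) to their replacements in the new graph
tensor_map = {
    id(model.layers[split_idx].output): second_input1,
    id(model.get_layer(name='block3a_project_bn').output): second_input2,
}

for layer in model.layers[split_idx + 1:]:
    orig_inputs = layer.input if type(layer.input) is list else [layer.input]
    new_inputs = [tensor_map[id(t)] for t in orig_inputs]  # KeyError -> add another Input
    out = layer(new_inputs if len(new_inputs) > 1 else new_inputs[0])
    tensor_map[id(layer.output)] = out

model_head = keras.models.Model(
    inputs=[second_input1, second_input2],
    outputs=tensor_map[id(model.layers[-1].output)])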

Another, cheap way would be: save the model as JSON, add your input node there, remove the unused nodes, and then load the new JSON file. In that case you do not need to rewire anything by hand.
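
A minimal sketch of that route (the actual config editing is only indicated by comments, since which entries to change depends on where you split):

import json
from tensorflow.keras.models import model_from_json

config = json.loads(model.to_json())
# edit config['config']['layers'] here:
#   - drop the backbone layers
#   - insert InputLayer entries for the new head inputs
#   - repoint the inbound_nodes of the first head layers to those inputs
#   - update config['config']['input_layers'] accordingly
model_head = model_from_json(json.dumps(config))

# copy the weights over by layer name
for layer in model_head.layers:
    if layer.weights:
        layer.set_weights(model.get_layer(name=layer.name).get_weights())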
