
How can I rename the input tensor name of an op in Tensorflow?

My graph definition before removing the dropout layers looks like this:

fc6/BiasAdd : BiasAdd ( [u'fc6/Conv2D', u'fc6/biases/read'] )
fc6/Relu : Relu ( [u'fc6/BiasAdd'] )
dropout/keep_prob : Const ( [] )
dropout/Shape : Shape ( [u'fc6/Relu'] )
dropout/random_uniform/min : Const ( [] )
dropout/random_uniform/max : Const ( [] )
dropout/random_uniform/RandomUniform : RandomUniform ( [u'dropout/Shape'] )
dropout/random_uniform/sub : Sub ( [u'dropout/random_uniform/max', u'dropout/random_uniform/min'] )
dropout/random_uniform/mul : Mul ( [u'dropout/random_uniform/RandomUniform', u'dropout/random_uniform/sub'] )
dropout/random_uniform : Add ( [u'dropout/random_uniform/mul', u'dropout/random_uniform/min'] )
dropout/add : Add ( [u'dropout/keep_prob', u'dropout/random_uniform'] )
dropout/Floor : Floor ( [u'dropout/add'] )
dropout/Inv : Inv ( [u'dropout/keep_prob'] )
dropout/mul : Mul ( [u'fc6/Relu', u'dropout/Inv'] )
dropout/mul_1 : Mul ( [u'dropout/mul', u'dropout/Floor'] )
fc7/weights : Const ( [] )
fc7/weights/read : Identity ( [u'fc7/weights'] )
fc7/Conv2D : Conv2D ( [u'dropout/mul_1', u'fc7/weights/read'] )

in the format node.name : node.type ( node.input )

After removing the dropout layers, I need to change the input tensor name of a specific layer. With the dropout layers removed, the graph looks like this:

fc6/BiasAdd : BiasAdd ( [u'fc6/Conv2D', u'fc6/biases/read'] )
fc6/Relu : Relu ( [u'fc6/BiasAdd'] )
fc7/weights : Const ( [] )
fc7/weights/read : Identity ( [u'fc7/weights'] )
fc7/Conv2D : Conv2D ( [u'dropout/mul_1', u'fc7/weights/read'] )

But, as you can see, the fc7/Conv2D operation still expects dropout/mul_1 as an input, so I am getting this error:

ValueError: graph_def is invalid at node u'fc7/Conv2D': Input tensor 'dropout/mul_1:0' not found in graph_def..

I want to change the expected input tensor name of that operation to fc6/BiasAdd, so that the network becomes valid. Is there a way to do that?

There is no straightforward way to do that. In general, the computation graph can be augmented with new operations, but existing nodes cannot be modified. There are three possible paths you can follow:

  • The easiest thing would be to leave the dropout layer as it is and simply pass a constant keep_prob of 1 (for example using a tf.placeholder_with_default). You will still have some minor overhead (I am not sure whether the implementation of dropout bypasses the operation when keep_prob is 1), but it will probably be unnoticeable.
  • Make a copy of the graph in another tf.Graph object without the dropout layers, and then copy the variable values from the session of the first one to a session of the new one (e.g. with tf.Variable.load()).
  • Actually edit the graph. Although this is not its main intended usage, it is possible to edit the graph to some extent. The module tf.contrib.graph_editor implements a number of operations to that end. In your case, you are probably looking for something like tf.contrib.graph_editor.swap_inputs. The drawback here is that these operations must be done "offline", that is, with no active sessions using the graph. That means variable values would in principle not be saved. You can checkpoint the model, manually save the variable values to NumPy arrays and restore them after the graph is modified, or, if you are done training and only intend to use your model for inference, you can also freeze the graph. In any case, you have to take care of preserving the variable values yourself.
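The first option can be sanity-checked numerically. The dropout block in the graph above computes floor(keep_prob + uniform[0,1)) * x * (1/keep_prob), and with keep_prob = 1 the floored mask is always 1, so the layer reduces to the identity. Below is a minimal sketch of that same arithmetic in plain Python (not the TensorFlow API, just the ops from the graph listing):

```python
import math
import random

def dropout_forward(x, keep_prob):
    """Mirror the ops in the graph: RandomUniform -> add -> Floor -> Inv -> mul -> mul_1."""
    out = []
    for v in x:
        u = random.random()                # dropout/random_uniform, in [0, 1)
        mask = math.floor(keep_prob + u)   # dropout/add followed by dropout/Floor
        out.append(v * (1.0 / keep_prob) * mask)  # dropout/Inv, dropout/mul, dropout/mul_1
    return out

x = [0.5, -1.25, 3.0]
# With keep_prob = 1, floor(1 + u) == 1 for every u in [0, 1),
# and 1/keep_prob == 1, so the block passes inputs through unchanged.
assert dropout_forward(x, keep_prob=1.0) == x
```

This confirms that feeding keep_prob = 1 at inference time is mathematically an identity; the only remaining cost is executing the extra ops themselves.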
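One more note on editing: the error in the question is raised while importing a serialized GraphDef, and although nodes in a live tf.Graph cannot be modified, the protobuf itself can be rewritten before it is imported. The sketch below shows the rewiring loop using a minimal stand-in class for NodeDef so it runs without TensorFlow; with a real GraphDef, the same loop would iterate over graph_def.node and edit each node.input list before calling import_graph_def. (The Node class and rewire_input helper are illustrative, not part of any TensorFlow API.)

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    """Minimal stand-in for a GraphDef NodeDef (name, op, and input fields)."""
    name: str
    op: str
    input: List[str] = field(default_factory=list)

def rewire_input(nodes, old_name, new_name):
    """Point every node that consumes old_name at new_name instead."""
    for node in nodes:
        node.input = [new_name if inp == old_name else inp for inp in node.input]

# The graph after dropping the dropout nodes, as listed in the question.
nodes = [
    Node("fc6/Relu", "Relu", ["fc6/BiasAdd"]),
    Node("fc7/weights", "Const"),
    Node("fc7/weights/read", "Identity", ["fc7/weights"]),
    Node("fc7/Conv2D", "Conv2D", ["dropout/mul_1", "fc7/weights/read"]),
]

# fc6/Relu is what fed the dropout block originally, so it replaces dropout/mul_1.
rewire_input(nodes, "dropout/mul_1", "fc6/Relu")
assert nodes[-1].input == ["fc6/Relu", "fc7/weights/read"]
```

After this rewrite the dangling reference to dropout/mul_1 is gone, so importing the GraphDef no longer fails; the caveats about variable values from the bullet above still apply.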
