
Replacing a node in a graph with a custom op that has a variable dependency in TensorFlow

I am trying to replace a computation done in the graph with a custom op that does the same thing.

Let's say the graph has a constant A and a weight variable W. I create the custom op to take these two inputs and perform the entire computation (except the final weight-update step):

import tensorflow as tf

# The custom op consumes the constant A and the weight variable W
custom_op_tensor = custom_module.custom_op([A, W])
g_def = tf.get_default_graph().as_graph_def()
input_map = {tensor.name: custom_op_tensor}  # tensor: output of the replaced computation
train_op, = tf.import_graph_def(g_def, input_map=input_map,
                                return_elements=[train_op.name])  # names, not ops

After tf.import_graph_def, there are two W's: one from the original graph def and one in the imported graph. When the train op runs, the custom op keeps reading the original W while the imported W is the one being updated. As a result, gradient descent fails to do the right thing.
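The duplication can be confirmed by listing the variable ops in the default graph after the import; a minimal sketch, assuming TF r0.x defaults (variable nodes have op type "Variable" and imported nodes land under the "import/" name scope):

g = tf.get_default_graph()
# Two distinct variable nodes exist after the import
var_ops = [op.name for op in g.get_operations() if op.type == "Variable"]
print(var_ops)  # e.g. ['W', 'import/W']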

The problem is that instantiating custom_op requires the input weight tensor W. The new W is known only after the import, and the import in turn requires the custom op. How does one get around this circular dependency?
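One possible way to break the cycle, sketched here untested and assuming the original variable's ref tensor is reachable as "W:0" in the default graph, is to remap W itself in the input_map, so that every consumer in the imported graph (including the weight update) is rewired to the original variable and the imported duplicate is never touched:

g = tf.get_default_graph()
# Remap both the replaced computation and the variable itself;
# "W:0" is the ref output of the original variable op.
input_map = {
    tensor.name: custom_op_tensor,
    "W:0": g.get_tensor_by_name("W:0"),
}
train_op, = tf.import_graph_def(g_def, input_map=input_map,
                                return_elements=[train_op.name])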

Could you specify which version of TensorFlow you are using: r0.8, r0.9, r0.10, or r0.11?

It is not possible to replace an op in the graph with another op in place. But if you can access W, you could still make a backup copy of W (using deepcopy() from the copy module) before running the train op that updates it.
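A minimal sketch of that workaround, assuming a running Session sess (note that sess.run(W) already returns a fresh NumPy array, so the deepcopy is extra caution):

import copy

w_backup = copy.deepcopy(sess.run(W))  # snapshot W's current value
sess.run(train_op)                     # the train op updates W in place
sess.run(W.assign(w_backup))           # restore the saved value if needed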

Regards
