
Modify a pretrained model in TensorFlow

I want to know how to make changes to a graph loaded from TensorFlow's meta and checkpoint files, like this:

saver = tf.train.import_meta_graph('***.meta')
saver.restore(sess, tf.train.latest_checkpoint('./'))

For example, the existing graph with pretrained weights contains old_layer1 -> old_layer2. I want to insert a layer so it becomes old_layer1 -> new_layer -> old_layer2, where new_layer is randomly initialized since there are no pretrained parameters for it. An answer here said it's impossible, since a TF graph only allows appending. Is this true?

So I wonder if this can be worked around by loading the pretrained layers as individual variables, assigning the pretrained weights as their initial values, and connecting them myself, so that I can add new layers between the old ones. But I don't know how to do this in code.
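A minimal sketch of that workaround idea: read the pretrained values directly out of the checkpoint and use them as initializers for freshly created variables, then wire the layers together by hand. The checkpoint path, the variable names ('old_layer1/kernel', etc.), and the layer sizes below are assumptions; inspect your own checkpoint to find the real names and shapes.

import tensorflow as tf

# Read the pretrained values straight out of the checkpoint.
reader = tf.train.NewCheckpointReader('./model.ckpt')   # hypothetical checkpoint path
w1 = reader.get_tensor('old_layer1/kernel')              # assumed variable names
b1 = reader.get_tensor('old_layer1/bias')
w2 = reader.get_tensor('old_layer2/kernel')
b2 = reader.get_tensor('old_layer2/bias')

x = tf.placeholder(tf.float32, [None, int(w1.shape[0])])

# old_layer1: new variables initialized from the pretrained checkpoint values
h1 = tf.nn.relu(tf.matmul(x, tf.Variable(w1)) + tf.Variable(b1))

# new_layer: randomly initialized, since no pretrained parameters exist for it
w_new = tf.Variable(tf.random.truncated_normal([int(w1.shape[1]), int(w2.shape[0])], stddev=0.1))
b_new = tf.Variable(tf.zeros([int(w2.shape[0])]))
h_new = tf.nn.relu(tf.matmul(h1, w_new) + b_new)

# old_layer2: again initialized from the checkpoint values
logits = tf.matmul(h_new, tf.Variable(w2)) + tf.Variable(b2)

Because the pretrained arrays are passed as initial values of plain tf.Variable objects, the old layers start from their trained weights while new_layer starts from scratch, which is exactly the old_layer1 -> new_layer -> old_layer2 structure described above.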

Doing this with raw TensorFlow can be complicated, since the TF graph does not directly encode the structure of the layers. If your model was built with tf.keras, however, this is fairly straightforward, because loading a Keras model also loads its layer structure.
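A rough sketch of the tf.keras route, assuming the pretrained model was saved as a Keras model (for example with model.save('pretrained.h5')); the file name, the insertion point, and the size of the new layer are assumptions to adapt to your model.

import tensorflow as tf

# Load the pretrained model together with its layer structure and weights.
old_model = tf.keras.models.load_model('pretrained.h5')   # hypothetical file name

new_model = tf.keras.Sequential()
for i, layer in enumerate(old_model.layers):
    new_model.add(layer)                                   # reuses the pretrained layer objects
    if layer.name == 'old_layer1':                         # assumed insertion point
        new_model.add(tf.keras.layers.Dense(
            old_model.layers[i + 1].input_shape[-1],       # must match what old_layer2 expects
            activation='relu',
            name='new_layer'))                             # randomly initialized

new_model.build(input_shape=old_model.input_shape)
new_model.summary()

The old layers keep their weights because they are the very same layer objects from the loaded model; only new_layer starts from random initialization, and its output size is chosen to match the input old_layer2 expects.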
