
How to feed a Keras pre-trained model into a computational graph?

Lately I've been playing around with CNNs written in Keras on TensorFlow. Unfortunately I've run into one problem there:

In these magnificent tutorials ( https://github.com/Hvass-Labs/TensorFlow-Tutorials/blob/master/vgg16.py ; the rest of the code is in the same repository), which I highly recommend, the author loads the pre-trained vgg16.tfmodel in a very, very ugly way:

def __init__(self):
    # Now load the model from file. The way TensorFlow
    # does this is confusing and requires several steps.

    # Create a new TensorFlow computational graph.
    self.graph = tf.Graph()

    # Set the new graph as the default.
    with self.graph.as_default():

        # TensorFlow graphs are saved to disk as so-called Protocol Buffers
        # aka. proto-bufs which is a file-format that works on multiple
        # platforms. In this case it is saved as a binary file.

        # Open the graph-def file for binary reading.
        path = os.path.join(data_dir, path_graph_def)
        with tf.gfile.FastGFile(path, 'rb') as file:
            # The graph-def is a saved copy of a TensorFlow graph.
            # First we need to create an empty graph-def.
            graph_def = tf.GraphDef()

            # Then we load the proto-buf file into the graph-def.
            graph_def.ParseFromString(file.read())

            # Finally we import the graph-def to the default TensorFlow graph.
            tf.import_graph_def(graph_def, name='')

            # Now self.graph holds the VGG16 model from the proto-buf file.

        # Get a reference to the tensor for inputting images to the graph.
        self.input = self.graph.get_tensor_by_name(self.tensor_name_input_image)

        # Get references to the tensors for the commonly used layers.
        self.layer_tensors = [self.graph.get_tensor_by_name(name + ":0") for name in self.layer_names]
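Just for completeness, my understanding is that those layer tensors get used later roughly like this (a simplified sketch of how I read the tutorial, not code copied from it; the dummy image and the class name VGG16 are my assumptions):

import numpy as np
import tensorflow as tf

# The class whose __init__ is shown above (called VGG16 in the tutorial, if I read it right).
vgg = VGG16()

# A dummy image instead of a real photo, just to show the feed.
image = np.zeros((224, 224, 3))

with tf.Session(graph=vgg.graph) as sess:
    # Feed the image (with a batch dimension) into the input tensor
    # and fetch the activations of the listed layers.
    feed_dict = {vgg.input: np.expand_dims(image, axis=0)}
    layer_values = sess.run(vgg.layer_tensors, feed_dict=feed_dict)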

And the problem is: I want to load my own pre-trained model in the same/similar way, so I can put the model into the graph of the class I'm calling later and, if possible, get the last lines of the code above working (meaning: get the tensors of the wanted layers from the graph).

All my attempts based on load_model imported from Keras and the computational graph have failed me. I also don't want to load it in a completely different way, because I would have to change A LOT of code afterwards, which for a newbie is kind of a big problem.

OK, I hope this question reaches the right person and also that it isn't too trivial for you :D.

BTW: to give you the full picture, the bigger problem I'm solving is style transfer, also from the same GitHub repository. ( https://github.com/Hvass-Labs/TensorFlow-Tutorials/blob/master/15_Style_Transfer.ipynb )

So basically you want to load a Keras model into TensorFlow? That can easily be done with the following code:

import keras.backend as K
from keras.models import load_model
import tensorflow as tf

model = load_model("your model.h5")  # now it's in the memory of Keras

# Keras already keeps its own TensorFlow session; grab that one instead of
# opening a new one (and don't close it, or Keras can't use it afterwards).
sess = K.get_session()

# sess.graph is the TensorFlow computational graph; dump it for TensorBoard:
tf.summary.FileWriter("folder name", sess.graph)

# if you need a certain tensor, do:
sess.graph.get_tensor_by_name("tensor name")
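
If you don't know the tensor names in advance, Keras can tell you, since every layer output already is a TensorFlow tensor with a name. A small sketch (the layer name "block1_conv1" is just an example; use whatever model.summary() prints for your model):

# The Keras model already exposes the underlying TF tensors, so you can
# look up their names instead of guessing them.
print(model.input.name)                               # name of the input tensor
print([layer.output.name for layer in model.layers])  # names of all layer outputs

# "block1_conv1" is just an example layer name; check model.summary() for yours.
tensor = model.get_layer("block1_conv1").output
same_tensor = sess.graph.get_tensor_by_name(tensor.name)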

To read a little more about the get_session function, have a look at the Keras backend documentation.

To view the graph, you need to point TensorBoard at the folder the FileWriter wrote to, like this:

tensorboard --logdir path/to/folder
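
And if what you really want is a .pb file you can load with the exact graph-def code from your question, a rough sketch would be to freeze the Keras graph first. This assumes TensorFlow 1.x, where tf.graph_util.convert_variables_to_constants is available; the output file name my_model.pb is just a placeholder:

import keras.backend as K
from keras.models import load_model
import tensorflow as tf

K.set_learning_phase(0)  # build the graph in inference mode before loading
model = load_model("your model.h5")
sess = K.get_session()

# The variables (weights) have to be baked into the graph as constants,
# otherwise the graph-def alone is useless. The output node names are the
# model's output tensor names without the ":0" suffix.
output_names = [out.op.name for out in model.outputs]
frozen_graph_def = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph.as_graph_def(), output_names)

# Write the frozen graph to disk; this file can then be loaded exactly like
# vgg16.tfmodel in your question (graph_def.ParseFromString + import_graph_def).
with tf.gfile.GFile("my_model.pb", "wb") as f:
    f.write(frozen_graph_def.SerializeToString())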

Hope this provided some help, good luck!
