
How to put tf.layers variables in tf.name_scope/tf.variable_scope?

I have a problem with Tensorflow:

The following code produces a correct(ish) graph for a convolutional block:

def conv_layer(self, inputs, filter_size = 3, num_filters = 256, name = None):
    scope_name = name
    if name is None:
        scope_name = "conv_layer"

    with tf.name_scope(scope_name):
        conv = tf.contrib.layers.conv2d(inputs, num_filters, filter_size, activation_fn = None)
        batch_norm = tf.contrib.layers.batch_norm(conv)
        act = tf.nn.leaky_relu(batch_norm)

        return act

The problem is that the tf.layers API creates some variables that do not actually stay within the name_scope. Here is the TensorBoard view so you can see what I mean.

[TensorBoard screenshot]

Is there any way to get those variables into the scope? This is a big problem for visualizing the graph, because I plan to make this network much larger. (As you can see on the right, it is already a problem: I have to remove those nodes from the main graph manually every time I start TensorBoard.)

You can try using tf.variable_scope instead. tf.name_scope is ignored by variables created via tf.get_variable(), which is what the tf.layers functions usually use. This is in contrast to variables created via tf.Variable.

See this question for an (albeit somewhat outdated) explanation of the differences.
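The difference is easy to demonstrate directly. Below is a minimal sketch (using the tf.compat.v1 API with graph mode enabled, so it also runs under TensorFlow 2) showing that tf.get_variable() ignores a surrounding tf.name_scope but does pick up a tf.variable_scope:

```python
import tensorflow as tf

# tf.get_variable() only exists in graph (non-eager) mode.
tf.compat.v1.disable_eager_execution()

with tf.compat.v1.name_scope("ns"):
    # name_scope is ignored by get_variable: no "ns/" prefix.
    v1 = tf.compat.v1.get_variable("v1", shape=[1])

with tf.compat.v1.variable_scope("vs"):
    # variable_scope is respected: the "vs/" prefix is applied.
    v2 = tf.compat.v1.get_variable("v2", shape=[1])

print(v1.name)  # v1:0
print(v2.name)  # vs/v2:0
```

This is why the variables created inside tf.layers / tf.contrib.layers calls escape a plain name_scope and clutter the top level of the TensorBoard graph.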

Solution moved from question to an answer:

Replacing each instance of tf.name_scope with tf.variable_scope solved the problem. However, I had to give each variable_scope a unique name and set reuse = False.

def conv_layer(self, inputs, filter_size = 3, num_filters = 256, name = None):
    scope_name = name
    if name is None:
        scope_name = "conv_layer_" + str(self.conv_id)
        self.conv_id += 1

    with tf.variable_scope(scope_name, reuse = False):
        conv = tf.contrib.layers.conv2d(inputs, num_filters, filter_size, activation_fn = None)
        batch_norm = tf.contrib.layers.batch_norm(conv)
        act = tf.nn.leaky_relu(batch_norm)

        return act

As you can see, the variables are nicely hidden away in the correct blocks.

