What does it mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope?
Tensorflow's variable_scope() and tf.AUTO_REUSE will not reuse variables in a for loop
I want to pass several different inputs through a reusable TensorFlow architecture (a decoder). To do this I use a for loop in which I feed my inputs into the model. However, the layer variables are not reused; instead, new variables are created on every loop iteration. Consider this code:
import tensorflow as tf

for i in range(5):
    decoder(input=input, is_training=is_training)
where the decoder is:
def decoder(self, input, is_training):
    with tf.variable_scope("physics", reuse=tf.AUTO_REUSE):
        latent = tf.expand_dims(latent, axis=1)
        latent = tf.expand_dims(latent, axis=1)
        x = latent
        """ Layer 1 """
        x = tf.layers.conv2d_transpose(x, filters=256, kernel_size=2, strides=1, activation='relu', padding='valid', name="transpose1_1", reuse=tf.AUTO_REUSE)
        x = tf.layers.batch_normalization(x, training=is_training, name="transpose_bn_1_1")
        """ Layer 2 """
        x = tf.layers.conv2d_transpose(x, filters=256, kernel_size=2, strides=2, activation='relu', padding='valid', name="transpose1_2", reuse=tf.AUTO_REUSE)
        x = tf.layers.batch_normalization(x, training=is_training, name="transpose_bn_1_2")
...
If I now print the variables immediately after the loop:
from pprint import pprint
pprint([n.name for n in tf.get_default_graph().as_graph_def().node])
I get the following output, which indicates that I am not sharing variables across loop iterations:
'physics/transpose1_1/kernel/Initializer/random_uniform/shape',
'physics/transpose1_1/kernel/Initializer/random_uniform/min',
'physics/transpose1_1/kernel/Initializer/random_uniform/max',
'physics/transpose1_1/kernel/Initializer/random_uniform/RandomUniform',
'physics/transpose1_1/kernel/Initializer/random_uniform/sub',
'physics/transpose1_1/kernel/Initializer/random_uniform/mul',
'physics/transpose1_1/kernel/Initializer/random_uniform',
'physics/transpose1_1/kernel',
'physics/transpose1_1/kernel/Assign',
'physics/transpose1_1/kernel/read',
'physics/transpose1_1/bias/Initializer/zeros',
'physics/transpose1_1/bias',
'physics/transpose1_1/bias/Assign',
'physics/transpose1_1/bias/read',
'physics/transpose1_1/Shape',
'physics/transpose1_1/strided_slice/stack',
'physics/transpose1_1/strided_slice/stack_1',
'physics/transpose1_1/strided_slice/stack_2',
'physics/transpose1_1/strided_slice',
'physics/transpose1_1/strided_slice_1/stack',
'physics/transpose1_1/strided_slice_1/stack_1',
'physics/transpose1_1/strided_slice_1/stack_2',
'physics/transpose1_1/strided_slice_1',
'physics/transpose1_1/strided_slice_2/stack',
'physics/transpose1_1/strided_slice_2/stack_1',
'physics/transpose1_1/strided_slice_2/stack_2',
'physics/transpose1_1/strided_slice_2',
'physics/transpose1_1/mul/y',
'physics/transpose1_1/mul',
'physics/transpose1_1/add/y',
'physics/transpose1_1/add',
'physics/transpose1_1/mul_1/y',
'physics/transpose1_1/mul_1',
'physics/transpose1_1/add_1/y',
'physics/transpose1_1/add_1',
'physics/transpose1_1/stack/3',
'physics/transpose1_1/stack',
'physics/transpose1_1/conv2d_transpose',
'physics/transpose1_1/BiasAdd',
'physics/transpose1_1/Relu',
...
'physics_4/transpose1_1/Shape',
'physics_4/transpose1_1/strided_slice/stack',
'physics_4/transpose1_1/strided_slice/stack_1',
'physics_4/transpose1_1/strided_slice/stack_2',
'physics_4/transpose1_1/strided_slice',
'physics_4/transpose1_1/strided_slice_1/stack',
'physics_4/transpose1_1/strided_slice_1/stack_1',
'physics_4/transpose1_1/strided_slice_1/stack_2',
'physics_4/transpose1_1/strided_slice_1',
'physics_4/transpose1_1/strided_slice_2/stack',
'physics_4/transpose1_1/strided_slice_2/stack_1',
'physics_4/transpose1_1/strided_slice_2/stack_2',
'physics_4/transpose1_1/strided_slice_2',
'physics_4/transpose1_1/mul/y',
'physics_4/transpose1_1/mul',
'physics_4/transpose1_1/add/y',
'physics_4/transpose1_1/add',
'physics_4/transpose1_1/mul_1/y',
'physics_4/transpose1_1/mul_1',
'physics_4/transpose1_1/add_1/y',
'physics_4/transpose1_1/add_1',
'physics_4/transpose1_1/stack/3',
'physics_4/transpose1_1/stack',
'physics_4/transpose1_1/conv2d_transpose',
'physics_4/transpose1_1/BiasAdd',
'physics_4/transpose1_1/Relu',
What is happening here? Shouldn't the tf.AUTO_REUSE flag let me initialize my decoder once at i==0 and then reuse its variables for all iterations i>0? The output above repeats for every layer I have in the decoder.
I am using TensorFlow version 1.12.0.
Thanks.
You are already reusing the variables in your for loop. A node in the graph is not the same thing as a Variable. The following example contains several nodes but only one Variable:
import tensorflow as tf
a = tf.Variable([2.0],name='a')
b = a+1
print([n.name for n in tf.get_default_graph().as_graph_def().node])
['a/initial_value', 'a', 'a/Assign', 'a/read', 'add/y', 'add']
You should inspect the variables in a different way. Two options:
1. Filter the graph nodes by op type with if "Variable" in n.op:
print([n.name for n in tf.get_default_graph().as_graph_def().node if "Variable" in n.op])
['a']
2. Use tf.global_variables():
print(tf.global_variables())
[<tf.Variable 'a:0' shape=(1,) dtype=float32_ref>]
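Both checks lead to the same conclusion: with tf.AUTO_REUSE, repeated calls into the same scope look up the existing variable instead of creating a new one. A minimal sketch of that behavior (the question targets TF 1.12, where these names live directly under tf; the tensorflow.compat.v1 aliases below are the same APIs and also run under TF 2.x):

```python
# Sketch: AUTO_REUSE makes a second get_variable call return the
# existing Variable instead of creating a new one.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()  # graph mode, as in TF 1.x

with tf.variable_scope("demo", reuse=tf.AUTO_REUSE):
    v1 = tf.get_variable("w", shape=[2], initializer=tf.zeros_initializer())

# Same scope and name again: the variable store finds "demo/w"
# and returns it rather than creating a second Variable.
with tf.variable_scope("demo", reuse=tf.AUTO_REUSE):
    v2 = tf.get_variable("w", shape=[2])

print(v1 is v2)                    # True: one shared Variable
print(len(tf.global_variables()))  # 1
```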
So in your code you should do the following:
import tensorflow as tf

def decoder(latent, is_training):
    with tf.variable_scope("physics", reuse=tf.AUTO_REUSE):
        x = latent
        """ Layer 1 """
        x = tf.layers.conv2d_transpose(x, filters=256, kernel_size=2, strides=1, activation='relu', padding='valid', name="transpose1_1", reuse=tf.AUTO_REUSE)
        x = tf.layers.batch_normalization(x, training=is_training, name="transpose_bn_1_1")
        """ Layer 2 """
        x = tf.layers.conv2d_transpose(x, filters=256, kernel_size=2, strides=2, activation='relu', padding='valid', name="transpose1_2", reuse=tf.AUTO_REUSE)
        x = tf.layers.batch_normalization(x, training=is_training, name="transpose_bn_1_2")

for i in range(5):
    decoder(latent=tf.ones(shape=[64,7,7,256]), is_training=True)

print([n.name for n in tf.get_default_graph().as_graph_def().node if "Variable" in n.op])
# print(tf.global_variables())
['physics/transpose1_1/kernel', 'physics/transpose1_1/bias', 'physics/transpose_bn_1_1/gamma', 'physics/transpose_bn_1_1/beta', 'physics/transpose_bn_1_1/moving_mean', 'physics/transpose_bn_1_1/moving_variance', 'physics/transpose1_2/kernel', 'physics/transpose1_2/bias', 'physics/transpose_bn_1_2/gamma', 'physics/transpose_bn_1_2/beta', 'physics/transpose_bn_1_2/moving_mean', 'physics/transpose_bn_1_2/moving_variance']
TF builds a variable name from the layer name. When it then tries to create the variable, it checks whether a variable with that name already exists. If it does, it raises an exception, unless you specify that the variable may be reused.
To fix your code, you need to use the same name for the layers that should share variables. The documentation says the same:
reuse: Boolean, whether to reuse the weights of a previous layer by the same name.
Additionally, to debug the code and make sure the variables really point to the same storage, you can simply remove the reuse argument and check that an exception is raised when you try to build the model a second time.
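A minimal sketch of that debugging check, using a bare tf.get_variable in a hypothetical scope name rather than the decoder layers (written against tensorflow.compat.v1 so it also runs under TF 2.x; on TF 1.12 you would import tensorflow directly):

```python
# Sketch: without a reuse argument, creating a variable whose name
# already exists raises ValueError.
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def make_layer():
    with tf.variable_scope("debug_scope"):  # note: no reuse argument
        return tf.get_variable("w", shape=[1])

make_layer()  # first call creates debug_scope/w

try:
    make_layer()  # second call tries to create debug_scope/w again
    raised = False
except ValueError:
    # the error message suggests setting reuse=True or
    # reuse=tf.AUTO_REUSE in VarScope
    raised = True

print(raised)  # True: the exception confirms both calls target one name
```

If the exception appears, both calls resolve to the same variable name, which is exactly what AUTO_REUSE then shares.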