
Tensorflow tf.layers, tf.contrib.layers not working with variable scope

I am beginning to use TensorFlow for some simple Q-learning, but have run into trouble when trying to use variable scopes with layers constructed using tf.layers and tf.contrib.layers . In a nutshell, I want to apply the same layers to different input tensors (for example, to hold the current and next Q values). Here is a minimal example using tf.layers :

import tensorflow as tf

inp1 = tf.placeholder(tf.float64, (4,1))
inp2 = tf.placeholder(tf.float64, (4,1))

def process(inp):
    with tf.variable_scope("foo", reuse=True):
        return tf.layers.dense(inp, 12, name="bar", reuse=True)

process(inp1)
process(inp2)

Trying to execute this code gives the following exception:

ValueError: Variable foo/bar/kernel does not exist, or was not created with
tf.get_variable(). Did you mean to set reuse=None in VarScope?

I understand that setting reuse=True in tf.layers.dense() makes it try to find an already defined layer, which it may fail to do. But if I change the call to tf.layers.dense(inp, 12, name="bar") , then it fails with the same exception.

If I set reuse=None in tf.variable_scope() , then the latter version fails during the call to process(inp2) with the exception:

ValueError: Variable foo/bar/kernel already exists, disallowed. 
Did you mean to set reuse=True in VarScope?

Unfortunately, similar errors occur when using tf.contrib.layers .

My question is: Is there a way to make tf.layers work with variable scopes? I know that I could define the weights and biases separately, but it would be nice to retain the abstraction given by tf.layers . Thanks a lot!

My setup is TensorFlow 1.3.0 (CPU) running with Python 3.6.1 on Windows 10 (installed through pip on 64-bit Anaconda 4.4.0).

PS I found the use of variable scopes for layers on page 17 of this presentation.

The two errors are different: the first happens in process(inp1) , which tries to find existing variables that do not yet exist; the second happens in process(inp2) , where a variable with the same name already exists, but the code tries to create a new variable under that name, which is disallowed.

I guess that you want to reuse those variables for Q-learning. The solution is quite simple: the first time you define those variables, don't set reuse ; after that, you can set reuse=True .

In the presentation you linked to, I guess they had already defined the variables beforehand.

This guide will help you understand more.
