Different outcomes when using tf.Variable() and tf.get_variable()

I'm trying to get familiar with the TensorFlow framework from this site by playing around with Linear Regression (LR). The source code for LR can be found here, with the name 03_linear_regression_sol.py.

Generally, the defined model for LR is Y_predicted = X * w + b, where

  • w and b are parameters ( tf.Variable )
  • Y_predicted and X are training data ( placeholder )

For w and b, in the sample code, they are defined as follows:

w = tf.Variable(0.0, name='weights')
b = tf.Variable(0.0, name='bias')

And I changed these two lines of code a little bit, as follows:

w = tf.get_variable('weights', [], dtype=tf.float32)
b = tf.get_variable('bias', [], dtype=tf.float32)

For this experiment, I got two different values of total_loss/n_samples for those two versions. More specifically, in the original version I got a deterministic result every time: 1539.0050282141283. But in the modified version I got non-deterministic results across different runs; for example, total_loss/n_samples could be 1531.3039793868859, 1526.3752814714044, and so on.
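One way to see where the difference comes from (a hypothetical check, not part of the original script; the name weights_v2 is used only to avoid a name clash) is to print the initial values of both kinds of variables right after initialization:

import tensorflow as tf

# Original definition: an explicit initial value of 0.0 (a constant)
w1 = tf.Variable(0.0, name='weights')

# Modified definition: no initializer given, so a default one is used
w2 = tf.get_variable('weights_v2', [], dtype=tf.float32)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(w1))  # 0.0 on every run
    print(sess.run(w2))  # a random value that changes from run to run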

What is the difference between tf.Variable() and tf.get_variable()?

tf.Variable accepts an initial value upon creation (a constant), which explains the deterministic results when you use it.

tf.get_variable is slightly different: it has an initializer argument, by default None, which is interpreted like this:

If initializer is None (the default), the default initializer passed in the variable scope will be used. If that one is None too, a glorot_uniform_initializer will be used.

Since you didn't pass an initializer, the variable got a uniform random initial value.
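If you want the tf.get_variable version to behave deterministically like the original, one option (a sketch in the same TF 1.x style as the question, not the only possible fix) is to pass an explicit initializer:

import tensorflow as tf

# An explicit initializer makes tf.get_variable start from 0.0,
# just like tf.Variable(0.0, ...)
w = tf.get_variable('weights', [], dtype=tf.float32,
                    initializer=tf.constant_initializer(0.0))
b = tf.get_variable('bias', [], dtype=tf.float32,
                    initializer=tf.zeros_initializer())

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run([w, b]))  # [0.0, 0.0] on every run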
