
Get coefficients of a linear regression in Tensorflow

I've done a simple linear regression in Tensorflow. How can I find out what the coefficients of the regression are? I've read the docs but I cannot find them anywhere! ( https://www.tensorflow.org/api_docs/python/tf/estimator/LinearRegressor )

EDIT: Code example

import numpy as np
import tensorflow as tf

# Declare list of features, we only have one real-valued feature
def model_fn(features, labels, mode):
  # Build a linear model and predict values
  W = tf.get_variable("W", [1], dtype=tf.float64)
  b = tf.get_variable("b", [1], dtype=tf.float64)
  y = W * features['x'] + b
  # Loss sub-graph
  loss = tf.reduce_sum(tf.square(y - labels))
  # Training sub-graph
  global_step = tf.train.get_global_step()
  optimizer = tf.train.GradientDescentOptimizer(0.01)
  train = tf.group(optimizer.minimize(loss),
                   tf.assign_add(global_step, 1))
  # EstimatorSpec connects subgraphs we built to the
  # appropriate functionality.
  return tf.estimator.EstimatorSpec(
      mode=mode,
      predictions=y,
      loss=loss,
      train_op=train)

estimator = tf.estimator.Estimator(model_fn=model_fn)
# define our data sets
x_train = np.array([1., 2., 3., 4.])
y_train = np.array([0., -1., -2., -3.])
x_eval = np.array([2., 5., 8., 1.])
y_eval = np.array([-1.01, -4.1, -7, 0.])
input_fn = tf.estimator.inputs.numpy_input_fn(
    {"x": x_train}, y_train, batch_size=4, num_epochs=None, shuffle=True)
train_input_fn = tf.estimator.inputs.numpy_input_fn(
    {"x": x_train}, y_train, batch_size=4, num_epochs=1000, shuffle=False)
eval_input_fn = tf.estimator.inputs.numpy_input_fn(
    {"x": x_eval}, y_eval, batch_size=4, num_epochs=1000, shuffle=False)

# train
estimator.train(input_fn=input_fn, steps=1000)
# Here we evaluate how well our model did.
train_metrics = estimator.evaluate(input_fn=train_input_fn)
eval_metrics = estimator.evaluate(input_fn=eval_input_fn)
print("train metrics: %r"% train_metrics)
print("eval metrics: %r"% eval_metrics)

EDIT: As Jason Ching points out, there have been some changes since this answer was posted. There are now the estimator methods get_variable_names and get_variable_value, and the estimator weights do not seem to be automatically added to tf.GraphKeys.MODEL_VARIABLES anymore.
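With those methods, a minimal sketch for the question's model (assuming the estimator from the question has already been trained, so its checkpoint contains W, b and the global step) would be:

# Print every variable stored in the estimator's checkpoint
for name in estimator.get_variable_names():
    print(name, estimator.get_variable_value(name))

# The coefficients of the custom model in the question are just "W" and "b"
W = estimator.get_variable_value("W")
b = estimator.get_variable_value("b")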


Estimators are designed to work basically as a black box, so there is no direct API to retrieve the weights. Even if, as in your case, you are the one defining the model (as opposed to using a preexisting one), you do not have direct access to the parameters from the estimator object.

That said, you can still retrieve the variables through other means. If you know the names of the variables, one option is simply to get them from the graph object with get_operation_by_name or get_tensor_by_name. A more practical and general option is to use a collection. Either when you call tf.get_variable, or afterwards by calling tf.add_to_collection, you can put the model variables under a common collection name for later retrieval. If you look at how a tf.estimator.LinearRegressor is actually built (search for the function linear_model in this module), all model variables are added to both tf.GraphKeys.GLOBAL_VARIABLES and tf.GraphKeys.MODEL_VARIABLES. This is (presumably, I haven't really checked) common to all the available canned estimators, so usually when using one of those you should be able to simply do:

model_vars = tf.get_collection(tf.GraphKeys.MODEL_VARIABLES)

It is preferable to use tf.GraphKeys.MODEL_VARIABLES in this case instead of tf.GraphKeys.GLOBAL_VARIABLES, which has a more general purpose and is likely to contain other unrelated variables as well.
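For a custom model_fn like the one in the question, nothing is added to that collection for you, so one possibility (a sketch, not a tested drop-in; only the two tf.add_to_collection calls are new compared with the question's code) is to tag the variables yourself when defining the model:

def model_fn(features, labels, mode):
  W = tf.get_variable("W", [1], dtype=tf.float64)
  b = tf.get_variable("b", [1], dtype=tf.float64)
  # Tag the coefficients so that tf.get_collection(tf.GraphKeys.MODEL_VARIABLES)
  # can find them later in the same graph (e.g. inside model_fn or after
  # restoring the checkpoint's meta graph).
  tf.add_to_collection(tf.GraphKeys.MODEL_VARIABLES, W)
  tf.add_to_collection(tf.GraphKeys.MODEL_VARIABLES, b)
  y = W * features['x'] + b
  loss = tf.reduce_sum(tf.square(y - labels))
  global_step = tf.train.get_global_step()
  optimizer = tf.train.GradientDescentOptimizer(0.01)
  train = tf.group(optimizer.minimize(loss),
                   tf.assign_add(global_step, 1))
  return tf.estimator.EstimatorSpec(
      mode=mode, predictions=y, loss=loss, train_op=train)

Keep in mind that collections are per-graph, so the retrieval has to happen in the graph where the variables were created, which is what restoring the checkpoint's meta graph (as in the answer below) gives you.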

Try this:

# LR is the trained estimator and tf_data is its model directory (model_dir),
# where the checkpoints are written.
LR.train(input_fn=train_input_data, steps=1)

with tf.Session() as sess:
    # Restore the graph and weights from the latest checkpoint
    last_check = tf.train.latest_checkpoint(tf_data)
    saver = tf.train.import_meta_graph(last_check + '.meta')
    print(last_check + '.meta')
    saver.restore(sess, last_check)

    Model_variables = tf.GraphKeys.MODEL_VARIABLES
    Global_Variables = tf.GraphKeys.GLOBAL_VARIABLES

    # Fetch the model variables from the restored graph and print their values
    all_vars = tf.get_collection(Model_variables)
    # print(all_vars)
    for i in all_vars:
        print(str(i) + '  -->  ' + str(i.eval()))
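If restoring a full Session is more than you need, a related sketch (assuming a TensorFlow version where tf.train.list_variables and tf.train.load_variable are available, roughly 1.4 and later) is to read the values straight from the checkpoint files:

import tensorflow as tf

# The question's estimator; its model directory holds the checkpoints
ckpt_dir = estimator.model_dir

# List every variable stored in the latest checkpoint together with its shape
for name, shape in tf.train.list_variables(ckpt_dir):
    print(name, shape)

# Load individual values as numpy arrays; "W" and "b" are the names used in the
# question's custom model_fn (a canned LinearRegressor uses different names).
W = tf.train.load_variable(ckpt_dir, "W")
b = tf.train.load_variable(ckpt_dir, "b")
print(W, b)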
