
How to do regression using tensorflow with series output?

I want to build a regression model with 2 output nodes using tensorflow. I found code that builds a regression model, but it only has 1 output node:

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/examples/skflow/boston.py

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from sklearn import cross_validation
from sklearn import metrics
from sklearn import preprocessing
import tensorflow as tf
from tensorflow.contrib import learn


def main(unused_argv):
  # Load dataset
  boston = learn.datasets.load_dataset('boston')
  x, y = boston.data, boston.target

  # Split dataset into train / test
  x_train, x_test, y_train, y_test = cross_validation.train_test_split(
      x, y, test_size=0.2, random_state=42)

  # Scale data (training set) to 0 mean and unit standard deviation.
  scaler = preprocessing.StandardScaler()
  x_train = scaler.fit_transform(x_train)

  # Build 2 layer fully connected DNN with 10, 10 units respectively.
  feature_columns = learn.infer_real_valued_columns_from_input(x_train)
  regressor = learn.DNNRegressor(
      feature_columns=feature_columns, hidden_units=[10, 10])

  # Fit
  regressor.fit(x_train, y_train, steps=5000, batch_size=1)

  # Predict and score
  y_predicted = list(
      regressor.predict(scaler.transform(x_test), as_iterable=True))
  score = metrics.mean_squared_error(y_predicted, y_test)

  print('MSE: {0:f}'.format(score))


if __name__ == '__main__':
  tf.app.run()

I am new to tensorflow, so I searched for code similar to what I need, but the code I found only has one output.

In my model, the input is N*1000 and the output is N*2. I wonder whether there is effective and efficient code for this kind of regression. Please give me an example.

Actually, I found workable code using DNNRegressor:

import numpy as np
from sklearn.cross_validation import train_test_split
from tensorflow.contrib import learn
import tensorflow as tf
import logging
# logging.getLogger().setLevel(logging.INFO)

# Some fake data: 200 samples with a single input feature.
N = 200
X = np.array(range(N), dtype=np.float32) / (N / 10)
X = X[:, np.newaxis]

# Two targets per sample: the input itself and its square.
# Y = np.sin(X.squeeze()) + np.random.normal(0, 0.5, N)
Y = np.zeros([N, 2])
Y[:, 0] = X.squeeze()
Y[:, 1] = X.squeeze() ** 2

X_train, X_test, Y_train, Y_test = train_test_split(X, Y,
                                                    train_size=0.8,
                                                    test_size=0.2)

# Fitting a single target column works; passing the full N*2 Y_train fails.
reg = learn.DNNRegressor(hidden_units=[10, 10])
reg.fit(X_train, Y_train[:, 0], steps=500)

But this code works only when the shape of Y_train is N*1; it fails when the shape of Y_train is N*2.
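One thing I have been looking at: depending on the TensorFlow version, DNNRegressor may accept a label_dimension argument that sets the number of output nodes. This is only a minimal sketch under that assumption (reusing the fake X/Y data above); I am not sure my installed version supports it:

```python
# Sketch only: assumes this DNNRegressor version supports `label_dimension`.
# With label_dimension=2 the network has 2 output nodes, so the full
# N*2 Y_train could be passed to fit() directly.
feature_columns = learn.infer_real_valued_columns_from_input(X_train)
reg2 = learn.DNNRegressor(feature_columns=feature_columns,
                          hidden_units=[10, 10],
                          label_dimension=2)
reg2.fit(X_train, Y_train, steps=500)
```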

However, I want to build a regression model whose input is N*1000 and whose output is N*2, and I can't get it to work.
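The other direction I tried is writing the 2-output network directly against the low-level graph API instead of an estimator. This is a minimal sketch, assuming TensorFlow 1.x placeholders and made-up random data with the shapes I need (N*1000 inputs, N*2 targets), not a finished model:

```python
import numpy as np
import tensorflow as tf

# Fake data matching the desired shapes: N samples, 1000 features, 2 targets.
N, D_in, D_out = 200, 1000, 2
X_data = np.random.randn(N, D_in).astype(np.float32)
Y_data = np.random.randn(N, D_out).astype(np.float32)

# Placeholders for a batch of inputs and targets.
x = tf.placeholder(tf.float32, [None, D_in])
y = tf.placeholder(tf.float32, [None, D_out])

# One hidden layer of 10 units, then a linear layer with 2 output nodes.
w1 = tf.Variable(tf.truncated_normal([D_in, 10], stddev=0.1))
b1 = tf.Variable(tf.zeros([10]))
hidden = tf.nn.relu(tf.matmul(x, w1) + b1)

w2 = tf.Variable(tf.truncated_normal([10, D_out], stddev=0.1))
b2 = tf.Variable(tf.zeros([D_out]))
y_pred = tf.matmul(hidden, w2) + b2  # shape [batch, 2]

# Mean squared error averaged over both output columns.
loss = tf.reduce_mean(tf.square(y_pred - y))
train_op = tf.train.AdamOptimizer(0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(500):
        _, l = sess.run([train_op, loss], feed_dict={x: X_data, y: Y_data})
    print('final loss:', l)
```

The output dimension here is just the size of the last weight matrix, so extending it to more targets or more hidden layers is straightforward. Is there a cleaner way to do this with the estimator-style API?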
