I was learning linear regression with TensorFlow 2.0, and I intended to use the SGD optimizer from Keras. Here's my code:
import tensorflow as tf
from tensorflow import keras
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
%matplotlib inline
x_train = [1,2,3]
y_train = [1,2,3]
W = tf.Variable(np.random.normal([1]),name='weight')
b = tf.Variable(np.random.normal([1]),name='bias')
cost = tf.reduce_mean(tf.square(x_train*W + b-y_train))
opt = keras.optimizers.SGD(learning_rate=0.1)
fig=plt.grid()
plt.scatter(x_train,y_train)
plt.xlabel('x')
plt.ylabel('y')
for i in range(20):
    plt.title('hypothesis: epoch {}'.format(i+1))
    plt.plot(hypothesis, 'r.-', label='hypothesis')
    plt.legend(loc='best')
    opt.minimize(cost, var_list=[W,b])
I intended to draw a plot for every single epoch, but I got this error at the last statement of the loop:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-8-be257fb20d71> in <module>
8 plt.plot(hypothesis, 'r.-',label='hypothesis')
9 plt.legend(loc='best')
---> 10 opt.minimize(cost, var_list=[W,b])
~\anaconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\keras\optimizer_v2\optimizer_v2.py in minimize(self, loss, var_list, grad_loss, name)
315 """
316 grads_and_vars = self._compute_gradients(
--> 317 loss, var_list=var_list, grad_loss=grad_loss)
318
319 return self.apply_gradients(grads_and_vars, name=name)
~\anaconda3\envs\tensorflow\lib\site-packages\tensorflow_core\python\keras\optimizer_v2\optimizer_v2.py in _compute_gradients(self, loss, var_list, grad_loss)
349 if not callable(var_list):
350 tape.watch(var_list)
--> 351 loss_value = loss()
352 if callable(var_list):
353 var_list = var_list()
TypeError: 'tensorflow.python.framework.ops.EagerTensor' object is not callable
How can I solve this?
Your cost: cost = tf.reduce_mean(tf.square(x_train*W + b - y_train))
is a tensor, but the loss argument of opt.minimize(cost, var_list=[W,b])
requires a zero-argument function that returns the loss, so the optimizer can re-evaluate it (and record gradients) on every step.
So you should define cost as a function instead of a tensor, for example with a lambda:
cost = lambda: tf.reduce_mean(tf.square(x_train*W + b - y_train))
Also, you should train for more epochs than 20.
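Putting the fix together, here is a minimal corrected sketch (plotting omitted). It assumes TF 2.x, where the Keras OptimizerV2's minimize accepts a callable loss and re-evaluates it under a gradient tape each step; the epoch count of 500 is an illustrative choice, not from the question:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

x_train = np.array([1., 2., 3.], dtype=np.float32)
y_train = np.array([1., 2., 3.], dtype=np.float32)

W = tf.Variable(tf.random.normal([1]), name='weight')
b = tf.Variable(tf.random.normal([1]), name='bias')

# cost must be a zero-argument callable, not a tensor, so that
# minimize() can recompute the loss (and its gradients) every step
cost = lambda: tf.reduce_mean(tf.square(x_train * W + b - y_train))

opt = keras.optimizers.SGD(learning_rate=0.1)

# train for more epochs than the original 20 so W, b can converge
for epoch in range(500):
    opt.minimize(cost, var_list=[W, b])

print(W.numpy(), b.numpy())  # should approach [1.] and [0.]
```

Equivalently, you could compute the gradients yourself with tf.GradientTape and call opt.apply_gradients; passing a callable just lets minimize do that bookkeeping for you.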