Error when fitting linear binary classifier with TensorFlow: ValueError: No gradients provided for any variable, check your graph
I am trying to build a simple neural network with the `reuse` option, but I get a strange error. I do not understand where the problem is. Maybe I am not using `mse` correctly.
```python
import tensorflow as tf

n_inputs = 8

x_ = tf.placeholder(tf.float32, [None, n_inputs])
l1 = tf.layers.dense(x_, 100, activation=tf.nn.relu, use_bias=True, name='l1', reuse=None)
l2 = tf.layers.dense(l1, 100, activation=tf.nn.relu, use_bias=True, name='l2', reuse=None)
l3 = tf.layers.dense(l2, 20, activation=tf.nn.relu, use_bias=True, name='l3', reuse=None)

y_ = tf.placeholder(tf.float32, [None, n_inputs])
w1 = tf.layers.dense(y_, 100, activation=tf.nn.relu, use_bias=True, name='l1', reuse=True)
w2 = tf.layers.dense(w1, 100, activation=tf.nn.relu, use_bias=True, name='l2', reuse=True)
w3 = tf.layers.dense(w2, 20, activation=tf.nn.relu, use_bias=True, name='l3', reuse=True)

z_ = tf.placeholder(tf.float32, [None, n_inputs])
u1 = tf.layers.dense(z_, 100, activation=tf.nn.relu, use_bias=True, name='l1', reuse=True)
u2 = tf.layers.dense(u1, 100, activation=tf.nn.relu, use_bias=True, name='l2', reuse=True)
u3 = tf.layers.dense(u2, 20, activation=tf.nn.relu, use_bias=True, name='l3', reuse=True)

mse1, _ = tf.metrics.mean_squared_error(l3, w3)
mse2, _ = tf.metrics.mean_squared_error(l3, u3)
cost = tf.subtract(mse1, mse2)
opts = tf.train.AdamOptimizer().minimize(cost)
sess = tf.InteractiveSession()
```
Error:
```
ValueError                                Traceback (most recent call last)
<ipython-input-4-0e3679c2a898> in <module>()
----> 1 __pyfile = open('''/tmp/py3823Cbm''');exec(compile(__pyfile.read(), '''/home/lpuggini/mlp/scratch/Kerberos/flow_ui.py''', 'exec'));__pyfile.close()

/home/lpuggini/mlp/scratch/Kerberos/flow_ui.py in <module>()
     33 cost = tf.subtract(mse1, mse2)
     34
---> 35 opts = tf.train.AdamOptimizer().minimize(cost)
     36 sess = tf.InteractiveSession()
     37

/home/lpuggini/MyApps/scientific_python_2_7/lib/python2.7/site-packages/tensorflow/python/training/optimizer.pyc in minimize(self, loss, global_step, var_list, gate_gradients, aggregation_method, colocate_gradients_with_ops, name, grad_loss)
    320           "No gradients provided for any variable, check your graph for ops"
    321           " that do not support gradients, between variables %s and loss %s." %
--> 322           ([str(v) for _, v in grads_and_vars], loss))
    323
    324     return self.apply_gradients(grads_and_vars, global_step=global_step,

ValueError: No gradients provided for any variable, check your graph for ops that do not support gradients, between variables ["<tf.Variable 'l1/kernel:0' shape=(8, 100) dtype=float32_ref>", "<tf.Variable 'l1/bias:0' shape=(100,) dtype=float32_ref>", "<tf.Variable 'l2/kernel:0' shape=(100, 100) dtype=float32_ref>", "<tf.Variable 'l2/bias:0' shape=(100,) dtype=float32_ref>", "<tf.Variable 'l3/kernel:0' shape=(100, 20) dtype=float32_ref>", "<tf.Variable 'l3/bias:0' shape=(20,) dtype=float32_ref>"] and loss Tensor("Sub:0", shape=(), dtype=float32).
```
`metrics` are not `losses`. Metrics record statistics over time; it makes no sense to differentiate through them. Besides the core TF documentation on metrics, there is a good write-up about them.

What you want is https://www.tensorflow.org/api_docs/python/tf/losses, and more specifically https://www.tensorflow.org/api_docs/python/tf/losses/mean_squared_error.
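Concretely, the fix is to replace `tf.metrics.mean_squared_error` (which returns a `(value, update_op)` pair and is not differentiable) with `tf.losses.mean_squared_error` in the question's code. A minimal sketch of the corrected graph, using the TF1-style API from the question (the `tf.compat.v1` shim is my addition so the snippet also runs under TensorFlow 2; on a real TF 1.x install, `import tensorflow as tf` works as-is):

```python
import numpy as np
import tensorflow.compat.v1 as tf  # shim so TF1-style code runs on TF2 (assumption)
tf.disable_v2_behavior()

n_inputs = 8

x_ = tf.placeholder(tf.float32, [None, n_inputs])
l1 = tf.layers.dense(x_, 100, activation=tf.nn.relu, name='l1')
l2 = tf.layers.dense(l1, 100, activation=tf.nn.relu, name='l2')
l3 = tf.layers.dense(l2, 20, activation=tf.nn.relu, name='l3')

y_ = tf.placeholder(tf.float32, [None, n_inputs])
w1 = tf.layers.dense(y_, 100, activation=tf.nn.relu, name='l1', reuse=True)
w2 = tf.layers.dense(w1, 100, activation=tf.nn.relu, name='l2', reuse=True)
w3 = tf.layers.dense(w2, 20, activation=tf.nn.relu, name='l3', reuse=True)

z_ = tf.placeholder(tf.float32, [None, n_inputs])
u1 = tf.layers.dense(z_, 100, activation=tf.nn.relu, name='l1', reuse=True)
u2 = tf.layers.dense(u1, 100, activation=tf.nn.relu, name='l2', reuse=True)
u3 = tf.layers.dense(u2, 20, activation=tf.nn.relu, name='l3', reuse=True)

# tf.losses.mean_squared_error returns a single differentiable scalar,
# so the optimizer can backpropagate through it.
mse1 = tf.losses.mean_squared_error(l3, w3)
mse2 = tf.losses.mean_squared_error(l3, u3)
cost = tf.subtract(mse1, mse2)
opt = tf.train.AdamOptimizer().minimize(cost)  # no ValueError now

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    feed = {p: np.random.rand(4, n_inputs).astype(np.float32)
            for p in (x_, y_, z_)}
    _, c = sess.run([opt, cost], feed_dict=feed)
```

Note that `tf.metrics.*` ops would also have required `tf.local_variables_initializer()` and repeated `update_op` calls to accumulate their running statistics, which is another sign they are meant for evaluation, not training.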