
While training I am getting the following error: "ValueError: No gradients provided for any variable"

I compute the gradients, but I am still getting the "no gradients" error. Even after looking at answers to similar errors, I cannot figure out what I am missing. I am using TensorFlow 2. The link to my code is https://github.com/Gadamsetty/ML_practise/blob/master/translatron_test.py

The error is as follows:

ValueError: in converted code:

    <ipython-input-15-031ef53603dd>:92 train_step  *
        optimizer.apply_gradients(zip(gradients,variables))
    /tensorflow-2.0.0/python3.6/tensorflow_core/python/keras/optimizer_v2/optimizer_v2.py:427 apply_gradients
        grads_and_vars = _filter_grads(grads_and_vars)
    /tensorflow-2.0.0/python3.6/tensorflow_core/python/keras/optimizer_v2/optimizer_v2.py:1025 _filter_grads
        ([v.name for _, v in grads_and_vars],))

    ValueError: No gradients provided for any variable: ['encoder_rnn_3/layer0/forward_gru_12/kernel:0', 'encoder_rnn_3/layer0/forward_gru_12/recurrent_kernel:0', 'encoder_rnn_3/layer0/forward_gru_12/bias:0', 'encoder_rnn_3/layer0/backward_gru_12/kernel:0', 'encoder_rnn_3/layer0/backward_gru_12/recurrent_kernel:0', 'encoder_rnn_3/layer0/backward_gru_12/bias:0', 'encoder_rnn_3/layer1/forward_gru_13/kernel:0', 'encoder_rnn_3/layer1/forward_gru_13/recurrent_kernel:0', 'encoder_rnn_3/layer1/forward_gru_13/bias:0', 'encoder_rnn_3/layer1/backward_gru_13/kernel:0', 'encoder_rnn_3/layer1/backward_gru_13/recurrent_kernel:0', 'encoder_rnn_3/layer1/backward_gru_13/bias:0', 'encoder_rnn_3/layer2/forward_gru_14/kernel:0', 'encoder_rnn_3/layer2/forward_gru_14/recurrent_kernel:0', 'encoder_rnn_3/layer2/forward_gru_14/bias:0', 'encoder_rnn_3/layer2/backward_gru_14/kernel:0', 'encoder_rnn_3/layer2/backward_gru_14/recurrent_kernel:0', 'encoder_rnn_3/layer2/backward_gru_14/bias:0', 'encoder_rnn_3/layer3/forward_gru_15/kernel:0', 'encoder_rnn_3/layer3/forward_gru_15/recurrent_kernel:0', 'encoder_rnn_3/layer3/forward_gru_15/bias:0', 'encoder_rnn_3/layer3/backward_gru_15/kernel:0', 'encoder_rnn_3/layer3/backward_gru_15/recurrent_kernel:0', 'encoder_rnn_3/layer3/backward_gru_15/bias:0', 'decoder_rnn_3/bidirectional_6/forward_decoder_layer1/kernel:0', 'decoder_rnn_3/bidirectional_6/forward_decoder_layer1/recurrent_kernel:0', 'decoder_rnn_3/bidirectional_6/forward_decoder_layer1/bias:0', 'decoder_...
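For context, the traceback points at a train_step that calls optimizer.apply_gradients(zip(gradients, variables)). A minimal sketch of that pattern is below; model, loss_object, inp, and targ are hypothetical placeholders standing in for the names in the linked script, not the original code.

    import tensorflow as tf

    optimizer = tf.keras.optimizers.Adam()

    @tf.function
    def train_step(inp, targ):
        with tf.GradientTape() as tape:
            # The forward pass and the loss must both run inside the tape;
            # otherwise tape.gradient() returns None for every variable and
            # apply_gradients raises "No gradients provided for any variable".
            predictions = model(inp, training=True)
            loss = loss_object(targ, predictions)

        variables = model.trainable_variables
        gradients = tape.gradient(loss, variables)
        optimizer.apply_gradients(zip(gradients, variables))
        return loss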

Since you are not doing anything with the gradients themselves, you could try calling optimizer.minimize(loss) directly instead of computing the gradients first and then applying them separately (lines 159-161). A hedged sketch of this is shown below.
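A minimal sketch of the suggested fix, assuming the same hypothetical model, optimizer, loss_object, inp, and targ placeholders as above. In TF 2.x, Keras optimizers' minimize() takes a zero-argument callable that returns the loss, so the forward pass has to happen inside that callable for the gradients to be traced.

    import tensorflow as tf

    optimizer = tf.keras.optimizers.Adam()

    def train_step(inp, targ):
        # minimize() traces this callable itself, computes the gradients,
        # and applies them in one call.
        def loss_fn():
            predictions = model(inp, training=True)
            return loss_object(targ, predictions)

        optimizer.minimize(loss_fn, var_list=model.trainable_variables)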

