
While training I am getting the following error: "ValueError: No gradients provided for any variable"

I calculated the gradients, but I am still getting the "no gradients" error. I am not able to figure out what I am missing, even after looking at answers to similar errors. I am using TensorFlow 2. The link to my code is https://github.com/Gadamsetty/ML_practise/blob/master/translatron_test.py

The error is as follows:

ValueError: in converted code:

    <ipython-input-15-031ef53603dd>:92 train_step  *
        optimizer.apply_gradients(zip(gradients,variables))
    /tensorflow-2.0.0/python3.6/tensorflow_core/python/keras/optimizer_v2/optimizer_v2.py:427 apply_gradients
        grads_and_vars = _filter_grads(grads_and_vars)
    /tensorflow-2.0.0/python3.6/tensorflow_core/python/keras/optimizer_v2/optimizer_v2.py:1025 _filter_grads
        ([v.name for _, v in grads_and_vars],))

    ValueError: No gradients provided for any variable: ['encoder_rnn_3/layer0/forward_gru_12/kernel:0', 'encoder_rnn_3/layer0/forward_gru_12/recurrent_kernel:0', 'encoder_rnn_3/layer0/forward_gru_12/bias:0', 'encoder_rnn_3/layer0/backward_gru_12/kernel:0', 'encoder_rnn_3/layer0/backward_gru_12/recurrent_kernel:0', 'encoder_rnn_3/layer0/backward_gru_12/bias:0', 'encoder_rnn_3/layer1/forward_gru_13/kernel:0', 'encoder_rnn_3/layer1/forward_gru_13/recurrent_kernel:0', 'encoder_rnn_3/layer1/forward_gru_13/bias:0', 'encoder_rnn_3/layer1/backward_gru_13/kernel:0', 'encoder_rnn_3/layer1/backward_gru_13/recurrent_kernel:0', 'encoder_rnn_3/layer1/backward_gru_13/bias:0', 'encoder_rnn_3/layer2/forward_gru_14/kernel:0', 'encoder_rnn_3/layer2/forward_gru_14/recurrent_kernel:0', 'encoder_rnn_3/layer2/forward_gru_14/bias:0', 'encoder_rnn_3/layer2/backward_gru_14/kernel:0', 'encoder_rnn_3/layer2/backward_gru_14/recurrent_kernel:0', 'encoder_rnn_3/layer2/backward_gru_14/bias:0', 'encoder_rnn_3/layer3/forward_gru_15/kernel:0', 'encoder_rnn_3/layer3/forward_gru_15/recurrent_kernel:0', 'encoder_rnn_3/layer3/forward_gru_15/bias:0', 'encoder_rnn_3/layer3/backward_gru_15/kernel:0', 'encoder_rnn_3/layer3/backward_gru_15/recurrent_kernel:0', 'encoder_rnn_3/layer3/backward_gru_15/bias:0', 'decoder_rnn_3/bidirectional_6/forward_decoder_layer1/kernel:0', 'decoder_rnn_3/bidirectional_6/forward_decoder_layer1/recurrent_kernel:0', 'decoder_rnn_3/bidirectional_6/forward_decoder_layer1/bias:0', 'decoder_...
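For context, this error usually means the tape recorded no connection between the loss and the trainable variables, most often because the forward pass or the loss was computed outside the tf.GradientTape block. A minimal sketch of the correct pattern (the model, optimizer, and data names here are placeholders, not the code from the question):

```python
import tensorflow as tf

# Placeholder model and data; in the real code these would be the
# encoder/decoder RNNs and the training batch.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.Adam()
x = tf.random.normal((8, 4))
y = tf.random.normal((8, 1))

with tf.GradientTape() as tape:
    pred = model(x)                              # forward pass INSIDE the tape
    loss = tf.reduce_mean(tf.square(pred - y))   # loss INSIDE the tape too

grads = tape.gradient(loss, model.trainable_variables)
# Every entry should be a tensor; a None gradient here is what triggers
# "No gradients provided for any variable" in apply_gradients.
assert all(g is not None for g in grads)
optimizer.apply_gradients(zip(grads, model.trainable_variables))
```

If any operation between the model output and the loss leaves the TensorFlow graph (e.g. converting to numpy), the corresponding gradients come back as None and produce this exact error.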

Since you are not doing any operation on the gradients, you can try using optimizer.minimize(loss) directly rather than first computing the gradients and then applying them separately (lines 159-161).
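A minimal sketch of that suggestion, with placeholder model and data names (in eager mode the loss must be passed as a zero-argument callable, and the variable list must be supplied):

```python
import tensorflow as tf

# Placeholder model and data standing in for the question's seq2seq model.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.Adam()
x = tf.random.normal((8, 4))
y = tf.random.normal((8, 1))

def loss_fn():
    # Zero-argument callable: minimize() runs it under its own tape,
    # so the forward pass and loss are recorded automatically.
    return tf.reduce_mean(tf.square(model(x) - y))

model(x)  # build the model once so trainable_variables is non-empty
optimizer.minimize(loss_fn, var_list=model.trainable_variables)
```

This replaces the tape.gradient / apply_gradients pair with a single call and avoids the mistake of computing the loss outside the tape.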

