I am currently coding a DRL algorithm that uses the Double DQN principle, with an evaluation model and a target model that gets updated over time.
My code contains this line:
self.q_next.set_weights(self.q_eval.get_weights())
Which results in the following error message:
ValueError: You called set_weights(weights)
on layer "d3qn_1" with a weight list of length 10, but the layer was expecting 0 weights. Provided weights: [array([[[[ 0.04574016, 0.03492326, -0.04824715, ...
Why can I not copy the weights from one network to the other?
I fixed this by building the q_next model first. Note that this requires knowing your input dimensions ahead of time, since a subclassed Keras model creates its weights only once it is built.
For example:
self.target_model.build((None, 80, 80, 1))
self.target_model.set_weights(self.model.get_weights())
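A minimal runnable sketch of the problem and the fix, assuming TensorFlow 2.x; `SmallQNet` and its 8-dimensional input are hypothetical stand-ins for the actual D3QN architecture:

```python
import numpy as np
import tensorflow as tf

class SmallQNet(tf.keras.Model):
    """Toy subclassed Q-network; layers exist, but weights are
    created only when the model is built or first called."""
    def __init__(self):
        super().__init__()
        self.d1 = tf.keras.layers.Dense(32, activation="relu")
        self.out = tf.keras.layers.Dense(4)

    def call(self, x):
        return self.out(self.d1(x))

q_eval = SmallQNet()
q_next = SmallQNet()

# A forward pass builds q_eval and creates its variables.
q_eval(tf.zeros((1, 8)))

# q_next has never been built, so it reports zero weights --
# calling set_weights() here would raise the ValueError above.
assert len(q_next.get_weights()) == 0

# Fix: build the target network with the known input shape first,
# then the weight lists match and the copy succeeds.
q_next.build((None, 8))
q_next.set_weights(q_eval.get_weights())
```

Alternatively, running a single dummy forward pass through `q_next` (as done for `q_eval` above) builds it without an explicit `build()` call.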