Multiple training variables in GradientTape tensorflow

I have the following list of trainable_variables in tensorflow:

l_vars = [l_unary.trainable_variables, l_regularization.trainable_variables]

which is passed into the tape like so:

    grad_l_model = l_tape.gradient(tf.constant(l_loss), l_vars)

However, this gives the error:

AttributeError: 'list' object has no attribute '_in_graph_mode'

When wrapping l_vars with tf.Variable, I get the error:

tensorflow.python.framework.errors_impl.InvalidArgumentError: Shapes of all inputs must match: values[0].shape = [5,5,3,32] != values[1].shape = [32] [Op:Pack] name: initial_value

tf.keras.layers.concatenate([l_unary.trainable_variables, l_regularization.trainable_variables], 0)

gives:

ValueError: A Concatenate layer should be called on a list of at least 2 inputs

How do I train the variables of multiple models together in TensorFlow?

Note that your l_vars is not a flat list of variables but a list of lists, since each trainable_variables attribute is itself already a list. You can simply use

l_vars = l_unary.trainable_variables + l_regularization.trainable_variables

to concatenate the two lists into one big list.
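
The combined list can then be passed to tape.gradient and an optimizer in the usual way. Below is a minimal sketch; the model definitions, input shapes, and loss are placeholders standing in for l_unary, l_regularization, and l_loss from the question:

    import tensorflow as tf

    # Placeholder models standing in for l_unary and l_regularization.
    l_unary = tf.keras.Sequential([tf.keras.layers.Dense(8)])
    l_regularization = tf.keras.Sequential([tf.keras.layers.Dense(8)])
    l_unary.build((None, 4))
    l_regularization.build((None, 4))

    optimizer = tf.keras.optimizers.Adam()
    x = tf.random.normal((2, 4))

    with tf.GradientTape() as l_tape:
        # The loss must be computed inside the tape (not wrapped in tf.constant),
        # otherwise no gradients can flow back to the variables.
        l_loss = tf.reduce_mean(l_unary(x) ** 2) + tf.reduce_mean(l_regularization(x) ** 2)

    # One flat list containing the variables of both models.
    l_vars = l_unary.trainable_variables + l_regularization.trainable_variables

    grad_l_model = l_tape.gradient(l_loss, l_vars)
    optimizer.apply_gradients(zip(grad_l_model, l_vars))

Because l_vars is now a single flat list of tf.Variable objects, the gradients returned by tape.gradient line up with it one-to-one, which is exactly what apply_gradients expects.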
