
How to fetch the variable list of tf.layers

I want to apply gradients only to the variables of one specific layer, so I need a list of those variables to pass as the var_list argument to optimizer.minimize. But I don't know how to fetch them.

For example:

import tensorflow as tf

input = tf.placeholder(tf.float32, [None, 32, 32, 3])  # example input
a = tf.layers.conv2d(input, 3, 3, padding='same', name='a')
b = tf.layers.conv2d(a, 1, 3, padding='same', name='b')
loss = tf.reduce_mean(tf.pow(b - 1, 2))
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train_op = optimizer.minimize(loss, var_list=???)

I just want to train the kernel (weight) and bias variables of layer b, and keep layer a untouched.

How can I do this? Or should I use a lower-level API to implement it?

Well, if you want to train all the parameters of layer b (there are only the kernel weights and the bias) and keep the parameters of layer a as they are, as you described, then you can pass trainable=False to the tf.layers.conv2d call that builds layer a.
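For example, a minimal sketch of that approach (the learning rate here is an assumed value, not from the original question):

# Build layer a with trainable=False so its kernel and bias are left out
# of the trainable-variables collection and never updated.
a = tf.layers.conv2d(input, 3, 3, padding='same', name='a', trainable=False)
b = tf.layers.conv2d(a, 1, 3, padding='same', name='b')
loss = tf.reduce_mean(tf.pow(b - 1, 2))
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
# minimize() defaults to tf.trainable_variables(), which now only
# contains the variables of layer b.
train_op = optimizer.minimize(loss)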

But if you want more control over the variables, you can manually select the variables to train after listing them with tf.trainable_variables(), and pass that selection as var_list.
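For instance, a sketch that filters the trainable variables by the layer name used above (scope 'b'; the learning rate is again an assumed value):

# Inspect the trainable variables; the conv2d layers create names such as
# 'a/kernel:0', 'a/bias:0', 'b/kernel:0', 'b/bias:0'.
for v in tf.trainable_variables():
    print(v.name)

# Keep only the variables created under the scope of layer b.
b_vars = [v for v in tf.trainable_variables() if v.name.startswith('b/')]
# Equivalent lookup via the collection API:
# b_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope='b')

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train_op = optimizer.minimize(loss, var_list=b_vars)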
