How can I quickly check which tensorflow variables are updated during training and which are frozen?

In many scenarios we need to freeze some layers in a TensorFlow graph while keeping the other layers trainable.

Is there a quick way to check that the network is being trained as expected, i.e. that the variables in the frozen layers are really not updated during training?

I am using the following method to freeze all the variables in scope "ABC":

    with slim.arg_scope(inception.inception_v2_arg_scope()):
        with tf.variable_scope('ABC'):
            _, end_points = getattr(inception, 'inception_v2')(..., is_training=False)
            # ...

    # Exclude everything under the 'ABC/' scope from the optimizer's var_list.
    trainables = [v for v in tf.trainable_variables() if 'ABC/' not in v.name]
    optimizer = tf.train.AdamOptimizer().minimize(loss, var_list=trainables)

What is the suggested way to quickly confirm these variables are really not changed during training?

You can just check them after a couple of iterations:

    import numpy as np

    frozen_variables = [v for v in tf.trainable_variables() if 'ABC/' in v.name]
    frozen_values_before = sess.run(frozen_variables)  # snapshot before training
    # ... training code ...
    # The variables have different shapes, so compare each pair individually.
    for before, after in zip(frozen_values_before, sess.run(frozen_variables)):
        assert np.allclose(before, after)
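
If you would rather monitor this while training runs instead of checking once afterwards, logging the global norm of the frozen variables every few steps is a cheap heuristic: the printed value should stay exactly constant. A minimal sketch, assuming the same frozen_variables list, a train_op, and a hypothetical num_steps:

    frozen_norm = tf.global_norm(frozen_variables)
    for step in range(num_steps):  # num_steps is a placeholder for your training loop length
        sess.run(train_op)
        if step % 100 == 0:
            # Should print the same number every time if nothing under 'ABC/' moves.
            print('step', step, 'frozen norm:', sess.run(frozen_norm))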

However, as long as they are not in the var_list passed to the optimizer, you should be fine.
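
If you want to double-check which variables the optimizer will actually touch, you can split minimize() into its two steps and inspect the gradient pairs before applying them. A minimal sketch, assuming the loss and the trainables list from the question:

    optimizer = tf.train.AdamOptimizer()
    grads_and_vars = optimizer.compute_gradients(loss, var_list=trainables)
    # Only the variables listed here receive updates;
    # no name under 'ABC/' should appear in the output.
    for grad, var in grads_and_vars:
        print(var.name)
    train_op = optimizer.apply_gradients(grads_and_vars)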
