
Fine-tuning a deep neural network in TensorFlow

I want to partially fine-tune a pre-trained deep neural network in TensorFlow, i.e. load the pre-trained weights for all layers but only update the weights of the higher-level layers.

Is there any method in TensorFlow that lets me select which variables should be updated and which should be kept fixed?

Thank you in advance!

When you create an optimizer (e.g. tf.train.AdagradOptimizer) to train your model, you can pass an explicit var_list=[...] argument to the Optimizer.minimize() method. (If you don't specify this list, it defaults to all of the variables in tf.trainable_variables().)

For example, depending on your model, you might be able to use the names of your variables to define the list of variables to be optimized:

import tensorflow as tf

# Assuming all variables to be fine-tuned have a name that starts with
# "layer17/".
opt_vars = [v for v in tf.trainable_variables() if v.name.startswith("layer17/")]

# `optimizer` and `loss` are built elsewhere in your graph, e.g.
# optimizer = tf.train.AdagradOptimizer(learning_rate=0.01)
train_op = optimizer.minimize(loss, var_list=opt_vars)
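
For context, here is a minimal sketch of how this fits together with restoring pre-trained weights, assuming a hypothetical model whose fine-tuned layers live under a "layer17/" variable scope; the layer sizes, scope names, and checkpoint path are placeholders for whatever your own model and checkpoint use:

import tensorflow as tf

# Hypothetical two-layer graph: "layer16/" is treated as frozen,
# "layer17/" as the part to fine-tune. Shapes and sizes are placeholders.
x = tf.placeholder(tf.float32, [None, 128])
labels = tf.placeholder(tf.int64, [None])

with tf.variable_scope("layer16"):
    h = tf.layers.dense(x, 64, activation=tf.nn.relu)
with tf.variable_scope("layer17"):
    logits = tf.layers.dense(h, 10)

loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))

# Build the Saver before the optimizer so it only covers the model
# variables (and not the optimizer's accumulator slots).
saver = tf.train.Saver(tf.global_variables())

# Only the "layer17/" variables are handed to the optimizer.
opt_vars = [v for v in tf.trainable_variables()
            if v.name.startswith("layer17/")]
optimizer = tf.train.AdagradOptimizer(learning_rate=0.01)
train_op = optimizer.minimize(loss, var_list=opt_vars)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver.restore(sess, "/path/to/pretrained/checkpoint")  # placeholder path
    # sess.run(train_op, feed_dict={x: ..., labels: ...})

Because minimize() only creates update ops for the variables in var_list, the lower layers keep their restored weights while the "layer17/" variables are trained.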
