
Get Learning Rate from <tensorflow.python.keras.optimizer_v2.learning_rate_schedule.CosineDecay> Object

How can I get the value of the learning rate, updated at each on_train_batch_begin?

lr_decayed_fn = tf.keras.experimental.CosineDecay(initial_lr, decay_steps)
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=lr_decayed_fn))

I tried it this way:

from tensorflow.keras import backend as K

def on_train_batch_begin(self, batch, logs=None):
    lr = K.get_value(self.model.optimizer.lr)

but this returns <tensorflow.python.keras.optimizer_v2.learning_rate_schedule.CosineDecay object at 0x7f...> instead of a float.

When you set a function (or an object subclassing tf.keras.optimizers.schedules.LearningRateSchedule) as the learning rate, the optimizer's lr attribute holds that schedule object rather than a scalar. To get the current learning rate, you need to call that schedule with the current training step, which is available as the optimizer's iterations attribute.

class CustomCallback(tf.keras.callbacks.Callback):
    def on_train_batch_begin(self, batch, logs=None):
        # Calling the schedule with the current step returns a scalar tensor;
        # get_value converts it to a plain Python float.
        lr = tf.keras.backend.get_value(
            self.model.optimizer.lr(self.model.optimizer.iterations)
        )
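For completeness, here is a minimal end-to-end sketch of how the callback could be wired up under the TF 2.x optimizer_v2 API the question uses. The toy model, the random data, and the initial_lr / decay_steps values are placeholder assumptions for illustration, not part of the original question.

import numpy as np
import tensorflow as tf

# Placeholder hyperparameters (assumed values for illustration only).
initial_lr = 0.1
decay_steps = 1000

lr_decayed_fn = tf.keras.experimental.CosineDecay(initial_lr, decay_steps)

class LrLogger(tf.keras.callbacks.Callback):
    def on_train_batch_begin(self, batch, logs=None):
        # Evaluate the schedule at the optimizer's current step count.
        lr = tf.keras.backend.get_value(
            self.model.optimizer.lr(self.model.optimizer.iterations)
        )
        print(f"batch {batch}: lr = {lr:.6f}")

# Toy model and data, just to make the example runnable.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=lr_decayed_fn),
    loss="mse",
)

x = np.random.rand(64, 4).astype("float32")
y = np.random.rand(64, 1).astype("float32")
model.fit(x, y, batch_size=16, epochs=1, callbacks=[LrLogger()])

Running this prints a learning rate that decays from initial_lr toward zero over decay_steps steps, following the cosine schedule.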
