
Call Keras callback during training epoch more than once

I use TensorFlow Keras to train a neural network. Currently I use the following schedule function to reduce the learning rate over the course of training:

def learning_rate_scheduler(epoch, lr):
    # Keras passes (epoch, lr); decay the current learning rate once per epoch
    return lr * tf.math.exp(-0.1)

I use the callback as follows:

callback = tf.keras.callbacks.LearningRateScheduler(learning_rate_scheduler)
model.fit(x_train, y_train, epochs=10, callbacks=[callback], verbose=2)

This works as expected. With this approach, however, the learning rate is reduced only once per epoch. How can I modify this callback so that it is called n times per epoch instead of only once? Is that possible?

To do this, you will need to create a custom callback so that you have access to the batch-related methods. When you inherit from tf.keras.callbacks.Callback, you can override on_train_batch_end and set the learning rate on each batch. If you want to do it every N steps, you can add a step counter attribute and increment it every time on_train_batch_end is called, then only set the learning rate when self.step % self.N == 0. Some boilerplate code could look like this:

class LearningRateSchedule(tf.keras.callbacks.Callback):
    def __init__(self, N):
        super(LearningRateSchedule, self).__init__()
        self.N = N

    def on_train_begin(self, logs=None):
        # Counter of training batches seen so far
        self.step = 0

    def on_train_batch_end(self, batch, logs=None):
        self.step += 1
        if self.step % self.N == 0:
            # Update the optimizer's learning rate every N batches
            lr = self.get_lr()
            tf.keras.backend.set_value(self.model.optimizer.lr, lr)

    def get_lr(self):
        # Compute the new learning rate; here, decay the current one as in the question
        current_lr = tf.keras.backend.get_value(self.model.optimizer.lr)
        return current_lr * float(tf.math.exp(-0.1))
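You would then use it the same way as the built-in callback. As a minimal sketch (the value N=100 is an arbitrary example, and x_train / y_train are the same training data as in the question):

callback = LearningRateSchedule(N=100)
model.fit(x_train, y_train, epochs=10, callbacks=[callback], verbose=2)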
