
Call Keras callback during training epoch more than once

I use Tensorflow Keras to train a neural network. Currently I use the following callback to reduce the learning rate over the course of training:

def learning_rate_scheduler(epoch, lr):
    # LearningRateScheduler passes (epoch, lr) to the schedule function
    return lr * tf.math.exp(-0.1)

I use the callback as follows:

callback = tf.keras.callbacks.LearningRateScheduler(learning_rate_scheduler)
model.fit(x_train, y_train, epochs=10, callbacks=[callback], verbose=2)

This works as expected. With this approach, however, the learning rate is reduced only once per epoch. I would like to know how I can modify this callback so that it is called n times per epoch instead of only once. Is that possible?

To do this, you will need to create a custom callback so you have access to batch-related methods. When you inherit from tf.keras.callbacks.Callback, you can override on_train_batch_end and set the learning rate on each batch. If you want to do it every N steps, you can add a counter property and increment it every time on_train_batch_end is called. Then, only set the learning rate if self.counter % N == 0. Some boilerplate code could look like this:

class LearningRateSchedule(tf.keras.callbacks.Callback):
    def __init__(self, N):
        super(LearningRateSchedule, self).__init__()
        self.N = N  # update the learning rate every N batches

    def on_train_begin(self, logs=None):
        self.step = 0

    def on_train_batch_end(self, batch, logs=None):
        self.step += 1
        if self.step % self.N == 0:
            # Set learning rate for model
            lr = self.get_lr()
            tf.keras.backend.set_value(self.model.optimizer.lr, lr)

    def get_lr(self):
        # Function to compute the new learning rate, e.g. the exponential
        # decay from the question applied to the current learning rate
        current_lr = tf.keras.backend.get_value(self.model.optimizer.lr)
        return current_lr * tf.math.exp(-0.1)
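As a minimal end-to-end sketch (assuming TF 2.x, where `tf.keras.backend.set_value` and the `optimizer.lr` alias are available; the toy data, model, and `N=4` are illustrative choices, not from the question), the custom callback can be passed to `model.fit` just like the built-in one:

```python
import numpy as np
import tensorflow as tf

class LearningRateSchedule(tf.keras.callbacks.Callback):
    def __init__(self, N):
        super().__init__()
        self.N = N  # update the learning rate every N batches

    def on_train_begin(self, logs=None):
        self.step = 0

    def on_train_batch_end(self, batch, logs=None):
        self.step += 1
        if self.step % self.N == 0:
            # Exponential decay applied every N batches instead of every epoch
            lr = tf.keras.backend.get_value(self.model.optimizer.lr)
            tf.keras.backend.set_value(self.model.optimizer.lr, lr * np.exp(-0.1))

# Toy data and model purely for illustration
x_train = np.random.rand(64, 4).astype("float32")
y_train = np.random.rand(64, 1).astype("float32")
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1), loss="mse")

# With batch_size=8 there are 8 batches per epoch, so N=4 applies
# the decay twice per epoch rather than once
model.fit(x_train, y_train, epochs=1, batch_size=8,
          callbacks=[LearningRateSchedule(N=4)], verbose=0)
print(float(tf.keras.backend.get_value(model.optimizer.lr)))
```

After one epoch the learning rate has been decayed twice (by a factor of exp(-0.2) in total), which you can verify by reading it back from the optimizer as above.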

