Call Keras callback during training epoch more than once
I use TensorFlow Keras to train a neural network. Currently I use the following callback to reduce the learning rate over the course of training:
def learning_rate_scheduler(epoch, lr):
    return lr * tf.math.exp(-0.1)
I use the callback as follows:
callback = tf.keras.callbacks.LearningRateScheduler(learning_rate_scheduler)
model.fit(x_train, y_train, epochs=10, callbacks=[callback], verbose=2)
This works as expected. With this approach, however, the learning rate is reduced only once per epoch. I would like to know how I can modify this callback so that it is called n times per epoch instead of only once. Is that possible?
To do this, you will need to create a custom callback so you have access to batch-related methods. When you inherit from tf.keras.callbacks.Callback, you can override on_train_batch_end and set the learning rate on each batch. If you want to do it every N steps, you can add a step counter attribute and increment it every time on_train_batch_end is called. Then, only set the learning rate when self.step % N == 0. Some boilerplate code could look like this:
class LearningRateSchedule(tf.keras.callbacks.Callback):
    def __init__(self, N):
        super(LearningRateSchedule, self).__init__()
        self.N = N

    def on_train_begin(self, logs=None):
        self.step = 0

    def on_train_batch_end(self, batch, logs=None):
        self.step += 1
        if self.step % self.N == 0:
            # Set the learning rate on the model's optimizer
            lr = self.get_lr()
            tf.keras.backend.set_value(self.model.optimizer.lr, lr)

    def get_lr(self):
        # Compute the new learning rate here; as an example, apply
        # the same exponential decay used in the question
        current_lr = tf.keras.backend.get_value(self.model.optimizer.lr)
        return current_lr * tf.math.exp(-0.1)
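The counter-and-modulo logic above can be sanity-checked without Keras: if the rate is multiplied by exp(-0.1) once every N batches, the rate after a given number of batches is fully determined. A minimal sketch of that schedule (lr_at_step is a hypothetical helper, not part of the Keras API):

```python
import math

def lr_at_step(initial_lr, step, N, decay=-0.1):
    """Learning rate after `step` batches, when the rate is
    multiplied by exp(decay) once every N batches."""
    return initial_lr * math.exp(decay * (step // N))

# With N = 5, batches 0-4 keep the initial rate; after batch 5
# the rate has decayed once, after batch 10 twice, and so on.
```

With N chosen as steps_per_epoch // n, the callback fires n times per epoch, which is the behaviour the question asks for.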