
Print learning rate every epoch in SGD

I am trying to print the learning rate during mini-batch gradient descent, but the lr stays the same across epochs (always 0.10000000149), even though it is supposed to change every mini-batch. The code is as follows:

from keras import backend as K
from keras.callbacks import Callback
from keras.optimizers import SGD

# set the decay to 1e-1 to see the lr change between epochs
sgd = SGD(lr=0.1, decay=1e-1, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy',
              optimizer=sgd,
              metrics=['accuracy'])

class LossHistory(Callback):
    def on_epoch_begin(self, epoch, logs={}):
        # read the optimizer's lr variable
        lr = K.get_value(self.model.optimizer.lr)
        print('lr:', lr)

history = LossHistory()
model.fit(X_train, Y_train,
          batch_size=batch_size,
          nb_epoch=nb_epoch,
          callbacks=[history])

What you are printing is the initial learning rate, not the actual learning rate after decay. With time-based decay, Keras's SGD computes the effective learning rate per iteration as:

lr = self.lr * (1. / (1. + self.decay * self.iterations))
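As a rough illustration of how fast this schedule shrinks the rate (the iteration counts below are arbitrary, and `iterations` counts processed mini-batches, not epochs):

lr, decay = 0.1, 1e-1

# effective learning rate after a given number of mini-batches
for iterations in (0, 1, 10, 100):
    print(iterations, lr * (1. / (1. + decay * iterations)))
# prints 0.1, ~0.0909, 0.05, ~0.00909

The callback below evaluates exactly this expression against the optimizer's state at the end of every epoch: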
from keras import backend as K
from keras.callbacks import Callback


class SGDLearningRateTracker(Callback):
    def on_epoch_end(self, epoch, logs={}):
        optimizer = self.model.optimizer
        # reproduce the time-based decay that SGD applies internally
        lr = K.eval(optimizer.lr * (1. / (1. + optimizer.decay * optimizer.iterations)))
        print('\nLR: {:.6f}\n'.format(lr))

Then add the callback to your fit call:

model.fit(X_train, Y_train_cat,
          nb_epoch=params['n_epochs'],
          batch_size=params['batch_size'],
          validation_split=0.1,
          callbacks=[SGDLearningRateTracker()])
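For reference, in current tf.keras the same inverse-time schedule can be expressed as a LearningRateSchedule object, which makes the per-step rate directly queryable; a minimal sketch assuming TensorFlow 2.x (variable names are illustrative):

import tensorflow as tf

# inverse time decay: initial_lr / (1 + decay_rate * step / decay_steps);
# with decay_steps=1 this matches the old Keras time-based decay above
schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
    initial_learning_rate=0.1, decay_steps=1, decay_rate=1e-1)
optimizer = tf.keras.optimizers.SGD(learning_rate=schedule,
                                    momentum=0.9, nesterov=True)

# the schedule can be evaluated at any step count (steps = processed batches)
for step in (0, 1, 10, 100):
    print(step, float(schedule(step)))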

