
How to change the learning rate based on the previous epoch's accuracy in Keras? I am using an SGD optimizer.

This is the code I am trying to implement:

def scheduler(epoch):
  init_lr = 0.1
  # After every third epoch, decay the learning rate
  if (epoch + 1) % 3 == 0:
    changed_lr = init_lr * (1 - 0.05) ** epoch
  # I tried this branch to change the learning rate based on the previous
  # epoch's accuracy, i.e. when the current epoch's accuracy is lower than
  # the previous epoch's
  else:
    changed_lr = init_lr - 0.1 * init_lr
  return changed_lr

If you want to change the learning rate in relation to the number of epochs, use LearningRateScheduler:

import tensorflow as tf

def scheduler(epoch, lr):
  # Keep the initial learning rate for the first 10 epochs,
  # then decay it exponentially each epoch
  if epoch < 10:
    return lr
  else:
    return lr * tf.math.exp(-0.1)

model = <YOUR_MODEL>
model.compile(tf.keras.optimizers.SGD(), loss=<YOUR_LOSS>)

callback = tf.keras.callbacks.LearningRateScheduler(scheduler)
history = model.fit(X, y, epochs=15, callbacks=[callback])
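The every-third-epoch decay from the question fits this same callback. A minimal sketch (the 5% decay factor comes from the question's code; note that the callback passes the current learning rate in as lr, so there is no need to recompute it from the initial value):

def scheduler(epoch, lr):
  # Decay the current learning rate by 5% on every third epoch,
  # otherwise leave it unchanged
  if (epoch + 1) % 3 == 0:
    return lr * 0.95
  return lr

callback = tf.keras.callbacks.LearningRateScheduler(scheduler)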

If you want to change the learning rate in relation to some metric, use ReduceLROnPlateau:

callback = tf.keras.callbacks.ReduceLROnPlateau(
  monitor='acc',  # must match a logged metric name ('accuracy' in recent TF/Keras)
  factor=0.6,     # multiply the learning rate by 0.6 when the metric plateaus
  patience=5,     # wait 5 epochs without improvement before reducing
  min_lr=3e-6,    # never reduce the learning rate below this value
)
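ReduceLROnPlateau reacts to a metric that has stopped improving over several epochs. If you specifically want to lower the learning rate whenever the current epoch's accuracy is below the previous epoch's, as the question asks, a custom callback can do that. Below is a minimal sketch, assuming the model was compiled with metrics=['accuracy'] so that logs contains an 'accuracy' key; the class name AccuracyDropScheduler and the halving factor are illustrative choices, not part of any Keras API:

import tensorflow as tf

class AccuracyDropScheduler(tf.keras.callbacks.Callback):
  def __init__(self, factor=0.5, min_lr=1e-6):
    super().__init__()
    self.factor = factor    # multiply the learning rate by this on a drop
    self.min_lr = min_lr    # lower bound for the learning rate
    self.prev_acc = None

  def on_epoch_end(self, epoch, logs=None):
    acc = logs.get('accuracy')
    # Reduce the learning rate only if accuracy fell below the previous epoch's
    if self.prev_acc is not None and acc is not None and acc < self.prev_acc:
      old_lr = float(tf.keras.backend.get_value(self.model.optimizer.lr))
      new_lr = max(old_lr * self.factor, self.min_lr)
      tf.keras.backend.set_value(self.model.optimizer.lr, new_lr)
    self.prev_acc = acc

history = model.fit(X, y, epochs=15, callbacks=[AccuracyDropScheduler()])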
