
Saving a stable model in keras (deep-learning)

I am trying to train a model. I am using the checkpoint callback, which basically saves the best model (the one with the minimum validation loss). The problem I have is that sometimes this minimum occurs in the first epochs, where the validation loss is still a bit unstable. Is there a way to use checkpoints only once the model is stable, or after a certain number of epochs?

Here a picture of my training and validation curves:

[Image: training and validation loss curves]

You can use a custom callback in which you add a logical statement with your condition. If the condition is met, you can call the ModelCheckpoint code from https://github.com/fchollet/keras/blob/master/keras/callbacks.py#L316 .

If my words don't make sense, this code snippet will!

Thanks.

import keras

class ModifiedCheckpoint(keras.callbacks.Callback):
    """Saves the best model, but only after `start_epoch` epochs,
    so the unstable early minima are ignored."""
    def __init__(self, filepath, monitor='val_loss', start_epoch=10):
        super(ModifiedCheckpoint, self).__init__()
        self.filepath = filepath
        self.monitor = monitor
        self.start_epoch = start_epoch
        self.best = float('inf')

    def on_epoch_end(self, epoch, logs={}):
        current = logs.get(self.monitor)
        if current is None:
            return
        # Your logic goes here: skip the early, unstable epochs,
        # then save whenever the monitored metric improves.
        if epoch + 1 >= self.start_epoch and current < self.best:
            self.best = current
            self.model.save(self.filepath)
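The logic inside `on_epoch_end` boils down to a single check, sketched below as a plain helper so it can be reasoned about in isolation (the name `should_save` and the `start_epoch` threshold are illustrative, not part of the Keras API):

```python
def should_save(epoch, current, best, start_epoch):
    """Decide whether to checkpoint: only once `start_epoch`
    epochs have elapsed, and only on a new best metric value."""
    return epoch + 1 >= start_epoch and current < best

# An early epoch is skipped even if its loss is a (noisy) minimum
print(should_save(epoch=2, current=0.10, best=0.50, start_epoch=10))   # False
# After the warm-up, an improvement triggers a save
print(should_save(epoch=12, current=0.30, best=0.50, start_epoch=10))  # True
```

Keras passes 0-based epoch numbers to `on_epoch_end`, hence the `epoch + 1` in the comparison.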
