
Is this TF training curve overfitting or underfitting?

As far as I know, in the case of overfitting the val_loss should climb while the train_loss keeps falling. But what about the case below, where val_loss stays low? Is this model underfitting horribly, or is it some completely different case? My previous models would overfit badly, so I added a dropout of 0.3 (4 CuDNNGRU layers with 64 units each, one Dense layer, and a batch size of 64). Should I reduce the dropout?
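For reference, here is a minimal sketch of the setup described above, assuming TF 1.x-style tf.keras where CuDNNGRU is available. The input shape, output size, and loss are placeholder assumptions, not taken from the question, and "dropout of 0.3" is interpreted as Dropout layers between the recurrent layers, since CuDNNGRU itself has no dropout argument:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import CuDNNGRU, Dropout, Dense

n_timesteps, n_features, n_outputs = 100, 16, 1  # placeholder shapes

model = Sequential()
# 4 CuDNNGRU layers of 64 units, each followed by Dropout(0.3)
model.add(CuDNNGRU(64, return_sequences=True,
                   input_shape=(n_timesteps, n_features)))
model.add(Dropout(0.3))
model.add(CuDNNGRU(64, return_sequences=True))
model.add(Dropout(0.3))
model.add(CuDNNGRU(64, return_sequences=True))
model.add(Dropout(0.3))
model.add(CuDNNGRU(64))  # last recurrent layer returns only the final state
model.add(Dropout(0.3))
model.add(Dense(n_outputs))

model.compile(optimizer="adam", loss="mse")  # loss/optimizer are assumptions
# model.fit(X_train, y_train, batch_size=64, validation_data=(X_val, y_val))
```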

[Figure: train_loss vs. validation_loss]

This is neither overfitting nor underfitting; some people refer to it as an "unknown fit". A validation loss well below the training loss happens when you apply regularization (L1, L2, dropout, ...) in Keras, because regularization is applied only during training, not during validation. So it makes sense that your training loss is higher: with dropout, for example, not all neurons are available during the forward pass at training time.
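To see that dropout is only active at training time, here is a tiny sketch (using TF 2.x eager mode for readability; the layer behaves the same inside a model):

```python
import numpy as np
import tensorflow as tf

drop = tf.keras.layers.Dropout(0.3)
x = np.ones((1, 8), dtype="float32")

# Training mode: ~30% of units are zeroed and the survivors are scaled by
# 1/0.7, so the training loss is computed on a handicapped network.
print(drop(x, training=True))

# Inference/validation mode: dropout is a no-op, all units are available.
print(drop(x, training=False))
```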

What is clear, though, is that your model is not improving on your validation set (the curve is almost a flat line). This can have several causes:

  • Your validation set is not representative of your dataset: it may contain examples that are too easy to predict, or it may simply be too small.
  • Your learning rate may be too high: try decreasing it, or add more regularization (e.g. recurrent_regularizer, since you are using CuDNNGRU); see the sketch after this list.
  • Your loss function is not appropriate for the problem you are trying to solve.
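As a sketch of the second point, the first layer of the model above could be rewritten like this (the L2 factor of 1e-4 and the learning rate of 1e-4 are arbitrary starting points I chose for illustration, not values from the answer):

```python
from tensorflow.keras import regularizers
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import CuDNNGRU, Dropout, Dense
from tensorflow.keras.optimizers import Adam

model = Sequential()
model.add(CuDNNGRU(
    64,
    return_sequences=True,
    input_shape=(100, 16),                        # placeholder shape
    kernel_regularizer=regularizers.l2(1e-4),     # penalize input weights
    recurrent_regularizer=regularizers.l2(1e-4),  # penalize recurrent weights
))
model.add(Dropout(0.3))
# ... remaining CuDNNGRU/Dropout layers as before ...
model.add(CuDNNGRU(64))
model.add(Dense(1))

# A lower learning rate than Adam's default of 1e-3
model.compile(optimizer=Adam(lr=1e-4), loss="mse")
```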

Hope these tips help you out.
