What if we have very few training examples for a deep learning model? Will using more epochs help the model achieve better accuracy?
Say I have 5 training examples under the label 'dog' and 5 under the label 'cat'. Will more epochs help me train a deep learning model with good accuracy?
I would advise you to look into the topic of overfitting/underfitting.
Generally, if you train for more epochs, you will at some point start overfitting. So more epochs will lead to better performance on the training set, but worse performance on any other set (generalization error). This is why most deep learning models use a validation set for early stopping:
The general idea is:

1. Fit the model to the training set for one epoch.
2. Check whether predictions on the validation set got worse (if yes, reduce patience).
3. If patience reaches 0, stop, and use the last model where validation performance improved.
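The loop above can be sketched in a framework-agnostic way. This is a minimal illustration, not a specific library's API: `fit_one_epoch` and `evaluate` are hypothetical callables standing in for your actual training step and validation-loss computation.

```python
def train_with_early_stopping(fit_one_epoch, evaluate, patience=3, max_epochs=100):
    """Generic early-stopping loop.

    fit_one_epoch() trains the model for one epoch;
    evaluate() returns the current validation loss (lower is better).
    Returns the epoch index and loss of the best validation result.
    """
    best_loss = float("inf")
    best_epoch = -1
    budget = patience
    for epoch in range(max_epochs):
        fit_one_epoch()
        val_loss = evaluate()
        if val_loss < best_loss:
            best_loss, best_epoch = val_loss, epoch
            budget = patience          # improvement: reset patience
        else:
            budget -= 1                # validation got worse
            if budget == 0:
                break                  # stop; caller restores the best checkpoint
    return best_epoch, best_loss

# Toy demo: validation loss improves, then rises again (overfitting sets in).
losses = iter([1.0, 0.8, 0.7, 0.75, 0.9, 1.1, 1.3])
best_epoch, best_loss = train_with_early_stopping(
    lambda: None, lambda: next(losses), patience=2)
print(best_epoch, best_loss)  # → 2 0.7 (stops after epoch 4, keeps epoch 2)
```

In a real setup you would also save the model weights whenever `best_loss` improves, so you can restore that checkpoint after stopping.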
If you have very little data, you should probably use leave-n-out cross-validation instead of a simple train/validation/test split.
Short answer: more epochs will help you perform better on the training data, but might (and likely will) lead to worse performance on any new data.