
Neural network: is it worth changing learning rate and momentum over time?

Is it worth changing the learning rate after certain conditions are met? And how, and why, should it be done? For example, the net starts with a high learning rate, and once the squared error is low enough the learning rate drops for better precision; or should the learning rate instead increase to jump out of local minima? Wouldn't that cause over-fitting? And what about momentum?

Usually you should start with a high learning rate and a low momentum, then decrease the learning rate over time and increase the momentum. The idea is to allow more exploration at the beginning of training and to force convergence at the end. Usually you should look at the training error to set up your learning schedule: if it gets stuck, i.e. the error stops changing, it is time to decrease your learning rate.
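The schedule described above can be sketched in plain Python. This is a minimal illustration, not a fixed recipe: the function name, the patience window, the decay factor, and the momentum step are all illustrative assumptions, and in practice you would tune them (or use a ready-made scheduler from your framework).

```python
def update_schedule(lr, momentum, error_history,
                    patience=3, lr_decay=0.5, momentum_step=0.1,
                    min_lr=1e-5, max_momentum=0.99, tol=1e-4):
    """Return (lr, momentum) adjusted from the recent training errors.

    If the error has not improved by more than `tol` over the last
    `patience` epochs, halve the learning rate (down to `min_lr`) and
    raise the momentum (up to `max_momentum`).
    All thresholds here are illustrative, not recommended values.
    """
    if len(error_history) > patience:
        recent = error_history[-(patience + 1):]
        # "Stuck": no real improvement over the last `patience` epochs.
        if recent[0] - min(recent[1:]) < tol:
            lr = max(lr * lr_decay, min_lr)
            momentum = min(momentum + momentum_step, max_momentum)
    return lr, momentum

# Usage: simulate a training run whose error plateaus at 0.35.
lr, momentum = 0.1, 0.5  # start: high learning rate, low momentum
errors = []
for e in [1.0, 0.6, 0.4, 0.35, 0.35, 0.35, 0.35, 0.35]:
    errors.append(e)
    lr, momentum = update_schedule(lr, momentum, errors)
```

Once the simulated error flattens, the learning rate is halved and the momentum raised on each stalled epoch, which matches the "decrease learning rate when the error does not change" rule from the answer.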

