Gradient descent self-coded: loss increasing gradually while optimizing - Python
I'm trying to estimate the two parameters of the following exponential decay, but the error (loss) is gradually increasing.
I tried a smaller learning_rate, from 10^-2 down to 10^-10,
recalculated the differentials,
and tried a different data set.
The parameters didn't bounce; they just changed steadily, i.e. only increased or only decreased.
The code and the data are here: https://github.com/psmuler/temp.git
What is wrong with the code? If I change the minus in
tau - dif_tau/len(data), b - dif_b/len(data)
on line 35 into a plus (+), it works. But surely that is not the solution.
Maybe the partial differentiation is wrong, or do I just misunderstand the very basics?
Thank you.
With the sign changed to plus, I got tau = 1291.352909 and b = 0.14934105 on data set 1_7, which correspond quite well.
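That the plus sign "works" is itself the giveaway: if the gradient is computed with the wrong sign, then stepping with minus actually climbs the loss, and flipping the update to plus accidentally descends again. A minimal illustration (names invented for the demo) with f(x) = x**2, whose true derivative is +2x:

```python
def grad_with_sign_error(x):
    # The intended derivative of f(x) = x**2 is +2*x; this one has the
    # sign flipped, mimicking a sign error inside a hand-derived gradient.
    return -2.0 * x

x_minus = 1.0  # standard update: x -= lr * grad
x_plus = 1.0   # flipped update:  x += lr * grad
for _ in range(100):
    x_minus -= 0.1 * grad_with_sign_error(x_minus)  # "descent" climbs steadily
    x_plus += 0.1 * grad_with_sign_error(x_plus)    # happens to converge to 0
```

The `x_minus` run matches the reported symptom exactly: the loss grows steadily rather than oscillating, because every step is consistently uphill. The real fix is to correct the derivative, not the update sign.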
You forgot to put a minus sign before time:
def dif_f0_b(time, tau, b):
return (-1)*math.exp(time/tau) + 1
must be
def dif_f0_b(time, tau, b):
return (-1)*math.exp(-time/tau) + 1
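For reference, here is a minimal self-contained sketch of the whole fit with the corrected derivative. The model form f(t) = b * (1 - exp(-t / tau)) and the helper names are assumptions reconstructed from the dif_f0_b shown above, not the asker's actual file; the two things to check against your own code are the minus inside exp and the minus in the parameter update.

```python
import math

# Assumed model, reconstructed from dif_f0_b: f(t) = b * (1 - exp(-t / tau)),
# fitted to (t, y) pairs by plain gradient descent on mean squared error.

def model(t, tau, b):
    return b * (1.0 - math.exp(-t / tau))

def dif_f0_tau(t, tau, b):
    # d f / d tau = -b * (t / tau**2) * exp(-t / tau)
    return -b * (t / tau ** 2) * math.exp(-t / tau)

def dif_f0_b(t, tau, b):
    # d f / d b = 1 - exp(-t / tau)   <-- note the minus inside exp
    return (-1) * math.exp(-t / tau) + 1

def fit(data, tau, b, lr=0.2, steps=20000):
    for _ in range(steps):
        dif_tau = dif_b = 0.0
        for t, y in data:
            err = model(t, tau, b) - y  # residual
            dif_tau += 2.0 * err * dif_f0_tau(t, tau, b)
            dif_b += 2.0 * err * dif_f0_b(t, tau, b)
        # Descend: step AGAINST the gradient, so these signs stay minus.
        tau -= lr * dif_tau / len(data)
        b -= lr * dif_b / len(data)
    return tau, b

# Synthetic data generated from known parameters tau=5, b=2.
data = [(t, 2.0 * (1.0 - math.exp(-t / 5.0))) for t in range(21)]
tau, b = fit(data, tau=6.0, b=1.5)
```

With both signs correct, the loss decreases monotonically and the fit recovers the generating parameters; with the wrong sign in dif_f0_b, the same loop with minus updates climbs the loss, which is exactly the behavior described in the question.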