Why does my scipy.optimize.minimize fail?
I tried using fmin_bfgs to find the local minimum of the absolute value function abs(x). The initial point is set to 100.0; the expected answer is 0.0. However, I get:
In [184]: op.fmin_bfgs(lambda x:np.abs(x),100.0)
Warning: Desired error not necessarily achieved due to precision loss.
Current function value: 100.000000
Iterations: 0
Function evaluations: 64
Gradient evaluations: 20
Out[184]: array([100.0])
Why?
Methods like fmin_bfgs and fmin_slsqp require smooth functions (ones with a continuous derivative) in order to give reliable results. abs(x) has a discontinuous derivative at its minimum, x = 0, which is exactly where BFGS's gradient-based line search breaks down and triggers the "precision loss" warning. A method that does not require continuous derivatives, such as the Nelder-Mead simplex, may give better results in this case.
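A minimal sketch of that suggestion, using the `scipy.optimize.minimize` interface with `method="Nelder-Mead"` (the same problem and starting point as in the question):

```python
import numpy as np
from scipy import optimize

# Nelder-Mead is a derivative-free simplex method, so the kink in
# abs(x) at x = 0 does not cause the precision-loss failure that
# gradient-based BFGS runs into.
result = optimize.minimize(lambda x: np.abs(x), x0=100.0,
                           method="Nelder-Mead")

print(result.x)  # should be close to 0.0 (within the default tolerances)
```

Note that Nelder-Mead only converges to within its default tolerances (`xatol`/`fatol`, both 1e-4), so the answer will be near zero rather than exactly 0.0; tighten those options if more precision is needed.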