Why does scipy.optimize.basinhopping give different results?
I need to find the global minimum of a complex function, so I use basinhopping from scipy.optimize. When I change the local minimization method (e.g. method="nelder-mead" vs. "L-BFGS-B") or the initial guess x0, I get different results, especially in the values of x, which I need for the next steps. For example, x[5] = 0.6 with "nelder-mead" but x[5] = 0.0008 with "L-BFGS-B", although the function values are similar: 2055.7795 vs. 2055.7756 (and both runs report success: True). I thought basinhopping finds the global minimum, so it should give the same result no matter what method or initial guess I use. Can anyone explain why? And what should I do to find the global minimum and check that it is global (not local)?

Thank you
The basin-hopping method is not guaranteed to find the global minimum for an arbitrary function. It is also not deterministic: there is a random component in the way it explores the vicinity of each minimum, as described in the documentation of the `take_step` argument.

If you want to reproduce the same result across two different calls, then in addition to using the same method you must pass the same `seed` parameter.

Using the same seed should also increase the likelihood of getting the same result with different local optimizer methods.
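As a minimal sketch of this (using the two-variable multi-modal test function from the scipy docs as a stand-in for your objective, since your actual function isn't shown), fixing `seed` makes each run reproducible, and with enough `niter` both local minimizers should land in the same global basin:

```python
import numpy as np
from scipy.optimize import basinhopping

# Multi-modal test function (from the scipy basinhopping docs);
# its global minimum is f(x*) ≈ -1.0109 at x* ≈ [-0.195, -0.1].
def f(x):
    return (np.cos(14.5 * x[0] - 0.3)
            + (x[1] + 0.2) * x[1]
            + (x[0] + 0.2) * x[0])

x0 = [1.0, 1.0]

# Same seed -> the random step sequence is reproducible for each call.
res_nm = basinhopping(f, x0, niter=200, seed=42,
                      minimizer_kwargs={"method": "Nelder-Mead"})
res_lb = basinhopping(f, x0, niter=200, seed=42,
                      minimizer_kwargs={"method": "L-BFGS-B"})

print(res_nm.x, res_nm.fun)
print(res_lb.x, res_lb.fun)
```

Note that even with the same seed, x can still differ slightly between methods (each local minimizer has its own tolerances), which is why comparing the function values, as you did, is the more meaningful check. To gain confidence that a minimum is global, rerun with several different seeds and starting points and check that the best function value found stops improving.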