
Why does scipy.optimize.basinhopping give different results?

I need to find the global minimum of a complex function, and I am using basinhopping from scipy.optimize. When I change the local minimizer method (for example, method="Nelder-Mead" vs "L-BFGS-B") or the initial guess x0, I get different results, especially in the values of x, which I need for the next steps. For instance, x[5] = 0.6 with "Nelder-Mead" but x[5] = 0.0008 with "L-BFGS-B", although the function values are similar: 2055.7795 vs 2055.7756 (and all runs report success: True). I thought basinhopping finds the global minimum, so it should give the same result no matter what method or initial guess I use. Can anyone explain why this happens, and suggest how I can find the global minimum and verify that it is global (not merely local)?
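As an aside, similar function values with very different x can occur whenever the objective has a flat valley or several near-degenerate minima. The following is a minimal, hypothetical sketch (not the questioner's actual function) using an objective whose minimum is an entire curve, so different local minimizers can legitimately stop at different points with essentially the same function value:

```python
import numpy as np
from scipy.optimize import basinhopping

# Hypothetical objective: every point on the hyperbola x*y = 1 gives f = 0,
# so the minimizer x is not unique even though the minimum value is.
def f(x):
    return (x[0] * x[1] - 1.0) ** 2

x0 = [2.0, 3.0]
res_nm = basinhopping(f, x0, minimizer_kwargs={"method": "Nelder-Mead"}, seed=1)
res_lb = basinhopping(f, x0, minimizer_kwargs={"method": "L-BFGS-B"}, seed=1)

# The function values agree (both essentially 0),
# but the reported x vectors need not be the same point.
print(res_nm.x, res_nm.fun)
print(res_lb.x, res_lb.fun)
```

If the real objective behaves like this, disagreement in x with agreement in f(x) is expected, not a bug.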

Thank you

Basin-hopping is not guaranteed to find the global minimum of an arbitrary function. It is also not deterministic: there is a random component in the way it explores the vicinity of the current point, as described in the documentation of the take_step argument.

If you want to reproduce the same result across two calls, then in addition to using the same method you must use the same seed parameter.

Using the same seed should also increase the likelihood of obtaining the same result with different local optimizer methods.
