I am trying to optimize a function of a small number of variables (somewhere between 2 and 10). Specifically, I want to calculate the minimum of the function on the bounded hypercube
[0,1] x [0,1] x ... x [0,1]
Calculating the function, its gradient and its Hessian is all relatively simple, quick and accurate.
Now, my problem is this: using scipy, I can call either scipy.optimize.minimize(..., method='Newton-CG') or scipy.optimize.minimize(..., method='TNC') to calculate the minimum of the function. However, Newton-CG uses the Hessian but does not support bounds, while TNC supports bounds but only uses the gradient. Is there any method that will use both?
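To make the trade-off concrete, here is a minimal sketch of the two calls, using a toy 2-variable quadratic as a stand-in for the real objective (A, b, f, grad and hess below are assumptions for illustration, not part of the original problem):

    import numpy as np
    from scipy.optimize import minimize

    # Toy quadratic stand-in for the real objective.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
    b = np.array([5.0, 4.0])                 # puts the unconstrained minimum outside [0,1]^2

    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    hess = lambda x: A

    x0 = np.full(2, 0.5)
    bounds = [(0.0, 1.0)] * 2

    # Newton-CG: uses the analytic Hessian, but ignores the bounds.
    res_newton = minimize(f, x0, jac=grad, hess=hess, method='Newton-CG')

    # TNC: respects the box constraints, but only uses the gradient.
    res_tnc = minimize(f, x0, jac=grad, bounds=bounds, method='TNC')

    print(res_newton.x)   # may land outside [0,1]^2
    print(res_tnc.x)      # stays inside the box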
Here are a couple of alternatives:
Mystic, a framework that enables constrained optimization via external constraints (I think via Lagrange multipliers). The package builds on scipy.optimize, so it should be possible to use SciPy's methods with additional constraints.
Ipopt, and its Python bindings PyIpopt and CyIpopt. You could also look into openopt.
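Of these, Ipopt is an interior-point solver that can use the exact Hessian together with bound constraints. Here is a hedged sketch via CyIpopt's scipy-like interface, assuming a recent cyipopt that exposes minimize_ipopt and accepts an analytic Hessian (the toy quadratic is again an assumed stand-in):

    import numpy as np
    from cyipopt import minimize_ipopt   # assumes a recent cyipopt release

    # Same toy quadratic as above; replace with the real f/grad/hess.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([5.0, 4.0])

    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    hess = lambda x: A

    res = minimize_ipopt(
        f,
        x0=np.full(2, 0.5),
        jac=grad,
        hess=hess,               # hess support assumes a recent cyipopt version
        bounds=[(0.0, 1.0)] * 2, # box constraints on every variable
    )
    print(res.x)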
Although developed primarily for curve fitting, lmfit provides the possibility to add external constraints. It includes most of the solvers from scipy.
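A hedged sketch of how the bounds could be expressed with lmfit, assuming the same toy quadratic and one of the scalar solvers it wraps (the parameter names x0/x1 and the 'nelder' method choice are illustrative assumptions):

    import numpy as np
    import lmfit

    # Scalar objective on [0, 1]^2; in lmfit the variables and their
    # bounds live in a Parameters object rather than in a plain array.
    def objective(params):
        x = np.array([params['x0'].value, params['x1'].value])
        A = np.array([[3.0, 1.0], [1.0, 2.0]])
        b = np.array([5.0, 4.0])
        return 0.5 * x @ A @ x - b @ x    # scalar value for a scalar solver

    params = lmfit.Parameters()
    params.add('x0', value=0.5, min=0.0, max=1.0)   # box constraint per parameter
    params.add('x1', value=0.5, min=0.0, max=1.0)

    result = lmfit.minimize(objective, params, method='nelder')
    print(result.params)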
L-BFGS-B does bounded optimisation. Like any quasi-Newton method, it approximates the Hessian, but this is often better than using the real Hessian.
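For comparison, a minimal L-BFGS-B sketch on the same assumed toy quadratic (only the gradient is supplied; the limited-memory Hessian approximation is built internally):

    import numpy as np
    from scipy.optimize import minimize

    # Toy quadratic stand-in for the real objective.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([5.0, 4.0])

    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b

    res = minimize(f, np.full(2, 0.5), jac=grad, method='L-BFGS-B',
                   bounds=[(0.0, 1.0)] * 2)
    print(res.x, res.fun)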