How to use scipy.optimize.minimize when you want to compute the gradient along with the objective function?

scipy.optimize.minimize takes obj and jac functions as input, and I believe it calls them separately as and when needed. But more often than not we come across objective functions whose gradient computation shares a lot of work with the objective function itself. So ideally I would like to compute obj and grad simultaneously. But that doesn't seem to be supported by this library? If one still wants to use scipy.optimize.minimize, what is the way to deal with this, if there is one at all?
You totally can. Just use jac=True:
In [1]: import numpy as np
In [2]: from scipy.optimize import minimize
In [3]: def f_and_grad(x):
...: return x**2, 2*x
...:
In [4]: minimize(f_and_grad, [1], jac=True)
Out[4]:
fun: 1.8367099231598242e-40
hess_inv: array([[ 0.5]])
jac: array([ 2.71050543e-20])
message: 'Optimization terminated successfully.'
nfev: 4
nit: 2
njev: 4
status: 0
success: True
x: array([ 1.35525272e-20])
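This is exactly the shared-computation case from the question: compute the expensive intermediate once and return (fun, grad) as a tuple. A minimal sketch with an illustrative least-squares objective (the A and b here are made up for the example, not from the original post):

```python
import numpy as np
from scipy.optimize import minimize

def f_and_grad(x, A, b):
    # The residual r = A @ x - b is the expensive intermediate
    # shared by the function value and its gradient.
    r = A @ x - b          # computed once
    fun = 0.5 * r @ r      # 0.5 * ||A x - b||^2
    grad = A.T @ r         # gradient reuses the same residual
    return fun, grad

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
res = minimize(f_and_grad, x0=np.zeros(2), args=(A, b), jac=True)
print(res.x)  # converges to the solution of A x = b, i.e. [0.4, -0.2]
```

With jac=True, SciPy unpacks each call into (objective value, gradient), so the residual is never recomputed for a separate gradient call.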
It's actually documented:

jac : bool or callable, optional
Jacobian (gradient) of objective function. Only for CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg. **If jac is a Boolean and is True, fun is assumed to return the gradient along with the objective function.** If False, the gradient will be estimated numerically. jac can also be a callable returning the gradient of the objective. In this case, it must accept the same arguments as fun.
(emphasis mine)
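For contrast, the callable form of jac mentioned at the end of that quote keeps the two computations separate, so SciPy calls fun and jac independently (a minimal sketch):

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Objective value only.
    return float(x[0] ** 2)

def grad(x):
    # Gradient supplied as a separate callable; must accept
    # the same arguments as f.
    return np.array([2.0 * x[0]])

res = minimize(f, [1.0], jac=grad)
print(res.x)  # near 0
```

This form is fine when the gradient is cheap; jac=True is the better fit when both share expensive intermediates.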