
How to use scipy.optimize.minimize function when you want to compute gradient along with the objective function?

scipy.optimize.minimize takes obj and jac functions as input, and I believe it calls them separately as and when needed. But more often than not we come across objective functions whose gradient computation shares a lot of computation with the objective function itself. So ideally I would like to compute obj and grad simultaneously. That doesn't seem to be supported by this library? If one still wants to use scipy.optimize.minimize, what is the way to deal with this, if there is one?

You totally can. Just use jac=True:

In [1]: import numpy as np

In [2]: from scipy.optimize import minimize

In [3]: def f_and_grad(x):
   ...:     return x**2, 2*x
   ...: 

In [4]: minimize(f_and_grad, [1], jac=True)
Out[4]: 
      fun: 1.8367099231598242e-40
 hess_inv: array([[ 0.5]])
      jac: array([  2.71050543e-20])
  message: 'Optimization terminated successfully.'
     nfev: 4
      nit: 2
     njev: 4
   status: 0
  success: True
        x: array([  1.35525272e-20])
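To illustrate the shared-computation point from the question, here is a sketch of my own (not from the original answer): a linear least-squares objective where the residual is computed once and reused by both the value and the gradient.

```python
import numpy as np
from scipy.optimize import minimize

def loss_and_grad(w, X, y):
    # Shared work: the residual r feeds both the loss and its gradient
    r = X @ w - y
    loss = 0.5 * np.dot(r, r)
    grad = X.T @ r
    return loss, grad

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

# jac=True tells minimize that loss_and_grad returns (value, gradient)
res = minimize(loss_and_grad, np.zeros(3), args=(X, y),
               jac=True, method="L-BFGS-B")
print(res.x)  # should recover w_true
```

Because the single callable returns both values, the residual `X @ w - y` is evaluated once per iterate instead of once for the objective and once again for the gradient.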

It's actually documented:

jac : bool or callable, optional. Jacobian (gradient) of objective function. Only for CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg. If jac is a Boolean and is True, fun is assumed to return the gradient along with the objective function. If False, the gradient will be estimated numerically. jac can also be a callable returning the gradient of the objective. In this case, it must accept the same arguments as fun.

(emphasis mine)
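For contrast, the quoted documentation also allows jac to be a separate callable. A minimal sketch of that style (my own example), where the objective and gradient are independent functions and cannot share intermediate work:

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Objective only: f(x) = x^2 (scalar)
    return float(x[0] ** 2)

def g(x):
    # Gradient supplied as a separate callable, invoked independently of f
    return 2 * x

res2 = minimize(f, [1.0], jac=g)
print(res2.x)  # converges to ~0
```

If the gradient is cheap or shares nothing with the objective, this separate-callable form is fine; when they share expensive intermediates, the jac=True form above avoids the duplicated work.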

