
How to return both fun and jac in scipy.optimize.least_squares

I am using scipy.optimize.least_squares to minimise a function of >40 parameters, and for my particular problem I can express the Jacobian of my function analytically. However, the function I minimise and the Jacobian share a lot of the same calculations. Is there a way to return both fun and jac from a single function? I know this can be done when using scipy.optimize.minimize, but I haven't figured out a way to do it with scipy.optimize.least_squares.
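(For reference, the scipy.optimize.minimize pattern referred to here is passing jac=True so that a single callable returns both the objective value and its gradient. A minimal sketch, using an illustrative quadratic objective that is not from the original post:

import numpy as np
from scipy.optimize import minimize

def value_and_grad(x):
    # Shared intermediate computation used by both the value and the gradient
    r = x - np.array([1.0, 2.0])
    f = 0.5 * np.dot(r, r)   # objective value
    g = r                     # analytic gradient
    return f, g

# jac=True tells minimize that value_and_grad returns (fun, grad)
res = minimize(value_and_grad, x0=np.zeros(2), jac=True)

least_squares has no equivalent flag, which is what the question is about.)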

There is no direct analog of fun_and_jac, it seems. A workaround is to refactor the common part of the calculation and use it in both callables. For instance (here's a deliberately simplified example):

In [8]: class F(object):
   ...:     def fun(self, x):
   ...:         return fun_rosenbrock(x)
   ...:     def jac(self, x):
   ...:         return jac_rosenbrock(x)
   ...:     

In [9]: x0_rosenbrock = np.array([2, 2])
   ...: f = F()
   ...: res_3 = least_squares(f.fun, x0_rosenbrock, f.jac)
   ...: res_3.x, res_3.cost
   ...: 
   ...: 
Out[9]: (array([1., 1.]), 0.0)

Here *_rosenbrock are taken from the docstring example of least_squares:

import numpy as np
from scipy.optimize import least_squares   # used in the session above

def fun_rosenbrock(x):
    return np.array([10 * (x[1] - x[0]**2), (1 - x[0])])

def jac_rosenbrock(x):
    return np.array([
        [-20 * x[0], 10],
        [-1, 0]])
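If fun and jac genuinely share expensive intermediate results, a common variant of this workaround is to cache the shared computation keyed on the last x seen, so a jac call at the same point reuses what the preceding fun call already computed. A minimal sketch along those lines, using the same Rosenbrock residuals; the _common helper and the caching scheme are illustrative, not part of SciPy's API:

import numpy as np
from scipy.optimize import least_squares

class CachedResiduals:
    def __init__(self):
        self._x = None
        self._shared = None

    def _common(self, x):
        # Recompute the shared intermediate only when x changes.
        if self._x is None or not np.array_equal(x, self._x):
            self._x = np.copy(x)
            self._shared = 10 * (x[1] - x[0]**2)   # term used by fun (trivial here)
        return self._shared

    def fun(self, x):
        s = self._common(x)
        return np.array([s, 1 - x[0]])

    def jac(self, x):
        self._common(x)   # warms the cache; the real saving comes when fun/jac share heavy work
        return np.array([[-20 * x[0], 10],
                         [-1, 0]])

f = CachedResiduals()
res = least_squares(f.fun, np.array([2.0, 2.0]), jac=f.jac)

For this toy problem the caching buys nothing, but with 40+ parameters and costly shared terms it avoids doing the common work twice per iteration.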
