Python correct use of scipy.optimize.minimize

I am trying to minimize the following function:
def mvqr(P, y, x, c):
    s = 0
    for i in xrange(1, len(y)):
        summation = numpy.linalg.norm(
            numpy.dot(numpy.linalg.inv(P), (y[i,:] - numpy.dot(beta, x[i,:])))
        ) + numpy.dot(
            numpy.dot(c.T, numpy.linalg.inv(P)),
            (y[i,:] - numpy.dot(beta, x[i,:]))
        )
        s = s + summation
    return s
These are the relevant lines of the main file:
fun = lambda beta: mvqr(E, Y_x, X_x, v)
result = minimize(fun, beta0, method = 'BFGS')
beta is the unknown variable of the function mvqr() and beta0 is the initial guess, a (2, 2) array I computed previously.
I got the following error:

NameError: global name 'beta' is not defined
For anyone wondering whether the file containing the function mvqr() is already located in the directory of the Python packages: yes, it is.
I think the problem lies with beta in the mvqr() function and the use of the lambda function. Any help?
EDIT

Thanks to pv., the code now runs without error, but the minimization does not iterate: the output of minimize displays the message 'Optimization terminated successfully.' yet simply returns the initial guess:
status: 0
success: True
njev: 1
nfev: 6
hess_inv: array([[1, 0, 0, 0],
[0, 1, 0, 0],
[0, 0, 1, 0],
[0, 0, 0, 1]])
fun: 1.2471261924040662e+31
x: array([ 3.44860608e+13, -4.10768809e-02, -1.42222910e+15,
-1.22803296e+00])
message: 'Optimization terminated successfully.'
jac: array([ 0., 0., 0., 0.])
I have also tried scipy.optimize.fmin_bfgs, but the result is much the same:
Optimization terminated successfully.
Current function value: 937385449919245008057547138533569682802290504082509386481664.000000
Iterations: 0
Function evaluations: 6
Gradient evaluations: 1
It could be that, unfortunately, beta0 is a local minimum, or at least a stationary point, since jac == [0, 0, 0, 0] holds and the algorithm therefore terminates. But it looks strange to me that the initial guess should be a minimum of the function (even a local one). Does anyone have an idea of how to avoid this?
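One thing worth checking: with parameters of magnitude 1e13 to 1e15, as in the output above, a tiny finite-difference step can fall below the floating-point spacing at that scale, so the numerically estimated gradient comes out as exactly zero and BFGS stops at the starting point. A common workaround is to rescale the variables (and the objective) to order 1. The sketch below uses a toy quadratic and an illustrative scale factor, not the actual mvqr problem:

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in objective (NOT mvqr): a quadratic whose minimum sits
# at 2e13, mimicking the huge parameter magnitudes in the output above.
def f(x):
    return np.sum((x - 2e13) ** 2)

x0 = np.array([1e13, 3e13])

# Work in rescaled variables z = x / 1e13 and rescale the objective
# so both the parameters and the function values are of order 1;
# finite-difference gradients are then resolvable.
scale = 1e13
res = minimize(lambda z: f(z * scale) / scale**2, x0 / scale, method='BFGS')
x_opt = res.x * scale  # map the solution back to original units
```

Whether this is the culprit here depends on the SciPy version (newer releases choose the finite-difference step relative to the parameter magnitude), but rescaling badly scaled problems helps the optimizer in either case.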
Any help would be appreciated.
Change the definition to

def mvqr(beta, P, y, x, c):

and do

fun = lambda beta: mvqr(beta.reshape(2, 2), E, Y_x, X_x, v)
minimize(fun, beta0.ravel())

if you wish to optimize a beta that is a 2x2 matrix.

After that, consider reading a Python tutorial, especially the part on global and local variables.
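The reshape/ravel pattern above can be sketched with a self-contained toy problem. The least-squares objective and random data below are illustrative stand-ins for mvqr and the question's E, Y_x, X_x, v, not the original code:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative stand-in data (not the question's E, Y_x, X_x, v).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
B_true = np.array([[1.0, 2.0], [3.0, 4.0]])
Y = X @ B_true.T  # each row satisfies y[i] = B_true @ x[i]

# A simple least-squares objective in place of mvqr: the optimizer
# only sees a flat 4-vector, which we reshape back to 2x2 inside.
def objective(beta_flat):
    beta = beta_flat.reshape(2, 2)
    return np.sum((Y - X @ beta.T) ** 2)

beta0 = np.zeros((2, 2))
res = minimize(objective, beta0.ravel(), method='BFGS')
beta_hat = res.x.reshape(2, 2)  # recovers B_true up to tolerance
```

The key point is that scipy.optimize.minimize only works with 1-D parameter vectors, so any matrix-valued unknown must be flattened with ravel() on the way in and rebuilt with reshape() inside the objective.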