
fmincon doesn't find a global minimum for a convex function

In my opinion, fmincon is a built-in MATLAB function for finding a local minimum. If the objective function is a convex problem, there is only one basin, and the local minimum is the global minimum. However, when starting from different initial points in my experiment, the algorithm obtained different minima. I wonder whether fmincon is guaranteed to converge to the global minimum for a convex problem. If not, are there any other techniques I can use to solve the convex optimization problem as fast as possible? Thanks.

PS: fmincon uses the interior-point method to search for the minimum by default. Is this normal behavior for an interior-point method, i.e., starting from different initial points, can the method reach different "global" minima for a convex problem?
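
For reference, the algorithm can be selected explicitly via optimoptions and the same problem solved with each choice to compare the results. Below is a minimal self-contained sketch with toy placeholder data (not the actual problem from this question):

fobj = @(x) sum(x.^2);                       % placeholder convex objective
nonlcon = @(x) deal(1 - sum(x), []);         % placeholder nonlinear constraint sum(x) >= 1
A = ones(1,4);  b = 10;                      % placeholder linear constraint sum(x) <= 10
x0 = ones(4,1);
for alg = {'interior-point', 'sqp'}
    opts = optimoptions('fmincon', 'Algorithm', alg{1}, 'Display', 'off');
    [~, fval] = fmincon(fobj, x0, A, b, [], [], [], [], nonlcon, opts);
    fprintf('%s: fval = %.6g\n', alg{1}, fval);
end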

EDIT:

The objective is to minimize the total energy consumption of a group of users in a communication process, while searching over the bandwidth allocation. The transmission rate is

$r_k = x_k \log_2\left(1+\frac{g_k p_k}{x_k}\right)$

The optimization problem is as follows:

$\min_{x} \sum_k \frac{p_k b_k}{r_k}$
s.t. $\sum_k x_k \leq X_{max}$

The objective and constraints are all convex, so this should be a convex optimization problem.

The code is as follows:

options = optimoptions('fmincon');           % default algorithm: interior-point
problem.options = options;
problem.solver = 'fmincon';
problem.objective = @(x) langBW(x, in_s, in_e, C1, a, p_ul);
problem.Aineq = ones(1,user_num);            % linear constraint: sum_k x_k <= BW2
problem.bineq = BW2;
problem.nonlcon = @(x) nonlConstr_bw(x,a,p_ul,T1,in_s,in_e);   % matches the 6-argument definition below

problem.x0 = ones(user_num,1);
[b_ul,fval] = fmincon(problem);
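
One detail worth noting: r_ul = x .* log2(1 + a.*p_ul./x) is only well defined for strictly positive bandwidths, so it may help to give fmincon explicit bounds that keep the iterates away from x <= 0. This is only a suggestion, and the particular numbers are assumptions, not values from the original post:

problem.lb = 1e-6 * ones(user_num,1);        % assumed small positive lower bound, keeps x > 0
problem.ub = BW2 * ones(user_num,1);         % assumed upper bound: no user exceeds the total bandwidth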

langBW is the objective function, which is a convex function of x. The code of langBW is as follows:

function fmin = langBW(x, in_s, in_e, C1, a, p_ul)
% Objective: total energy consumption as a function of the bandwidth allocation x.
if size(x,1) < size(x,2)
    x = x';                                  % make sure x is a column vector
end
b_ul = x;
r_ul = b_ul .* log2(1 + a.*p_ul./b_ul);      % per-user transmission rate

fmin = sum((in_s+in_e).*p_ul./r_ul) + sum(C1);

end
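
As a rough numerical cross-check of convexity, one can test the midpoint inequality of langBW along randomly sampled positive bandwidth vectors. The data below are made-up placeholders for illustration only, not the values from the experiment:

% Rough midpoint-convexity check of langBW on random positive points (placeholder data).
user_num = 4;
in_s = rand(user_num,1); in_e = rand(user_num,1);
C1 = zeros(user_num,1); a = rand(user_num,1); p_ul = rand(user_num,1);
f = @(x) langBW(x, in_s, in_e, C1, a, p_ul);
ok = true;
for trial = 1:1000
    x1 = 0.1 + rand(user_num,1);             % strictly positive sample points
    x2 = 0.1 + rand(user_num,1);
    if f((x1+x2)/2) > (f(x1)+f(x2))/2 + 1e-9
        ok = false; break;
    end
end
disp(ok)                                     % true suggests (but does not prove) convexity on the sampled region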

nonlConstr_bw is the function for the nonlinear constraints. It is shown below:

function [c,ceq] = nonlConstr_bw(x,a,p_ul,T1,in_s,in_e)
% Nonlinear latency constraints in the form c(x) <= 0, ceq(x) = 0.
user_num = size(p_ul,1);
if size(x,1) < size(x,2)
    x = x';                                  % make sure x is a column vector
end
b_ul = x;
r_ul = b_ul .* log2(1 + a.*p_ul./b_ul);      % per-user transmission rate

c1 = max(in_s./r_ul) + in_e./r_ul - T1;      % worst-case upload delay plus each user's own delay must stay below T1
c = c1;
ceq = zeros(user_num,1);                     % no real equality constraints (always satisfied)

end
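
Reading the code, the nonlinear constraint appears to encode a latency bound of the following form (my reading of nonlConstr_bw, not stated explicitly in the post):

$\max_j \frac{in\_s_j}{r_j} + \frac{in\_e_k}{r_k} \leq T_1 \quad \text{for each user } k$

Note that the max(...) term is not differentiable everywhere, while fmincon's algorithms assume smooth constraint functions, so this could also contribute to results that depend on the starting point.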

Except for x, all other variables are supplied. The problem is that when I set a different problem.x0, for example problem.x0 = ones(user_num,1); versus problem.x0 = 2*ones(user_num,1);, the solution returned by [b_ul,fval] = fmincon(problem); is different. That is what I am confused about.
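
One way to see whether the two runs really found different minimizers, or just stopped at different accuracy, is to compare fval, exitflag, and the first-order optimality measure of both runs. A minimal sketch using the problem structure above:

% Compare the two runs beyond just the objective value.
problem.x0 = ones(user_num,1);
[x1, f1, flag1, out1] = fmincon(problem);
problem.x0 = 2*ones(user_num,1);
[x2, f2, flag2, out2] = fmincon(problem);
fprintf('fval: %.8g vs %.8g (difference %.2g)\n', f1, f2, abs(f1-f2));
fprintf('exitflag: %d vs %d, first-order optimality: %.2g vs %.2g\n', ...
        flag1, flag2, out1.firstorderopt, out2.firstorderopt);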

fmincon uses the following algorithms:

'interior-point' (default)
'trust-region-reflective'
'sqp' (Sequential Quadratic Programming)
'sqp-legacy'
'active-set'

These methods will converge to a local minimum, but not necessarily a global minimum. Furthermore, minima may not be unique. The only way to guarantee a global minimum is to search the whole solution space.
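
If the Global Optimization Toolbox is available, a common mitigation is to run fmincon from many start points and keep the best result, for example with MultiStart. A sketch reusing the problem structure from the question (the lower bound and the number of start points are my own assumptions):

% Multi-start local optimization (requires the Global Optimization Toolbox).
ms = MultiStart('Display', 'final');
msProblem = createOptimProblem('fmincon', ...
    'objective', problem.objective, ...
    'x0', problem.x0, ...
    'Aineq', problem.Aineq, 'bineq', problem.bineq, ...
    'lb', zeros(user_num,1), ...             % assumed: bandwidth allocations are non-negative
    'nonlcon', problem.nonlcon);
[xBest, fBest] = run(ms, msProblem, 20);     % 20 random start points (arbitrary choice)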

From your comment, it appears there is only a single minimum? (For example, a shifted parabola?) Then it should converge.

edit--

Even if your function appears convex, the constraints can lead to multiple local minima. Sometimes this is called a "loosely" convex function.
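
To illustrate the point with a toy example of my own (unrelated to the bandwidth problem): a convex objective restricted to a nonconvex feasible set can have several local minima, and fmincon then stops at whichever one the start point leads to:

% Toy example: convex objective over a nonconvex feasible set with two components.
obj = @(x) (x(1) - 0.2)^2 + x(2)^2;                        % convex objective
nonlcon = @(x) deal((x(1)^2 - 1)^2 + x(2)^2 - 0.25, []);   % feasible set: two disjoint regions near x1 = +/-1
opts = optimoptions('fmincon', 'Display', 'off');
xA = fmincon(obj, [ 1; 0], [], [], [], [], [], [], nonlcon, opts);
xB = fmincon(obj, [-1; 0], [], [], [], [], [], [], nonlcon, opts);
disp([xA xB])                                              % typically two different stationary points, one per region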

