
fmincon doesn't find a global minimum for a convex function

As I understand it, fmincon is a built-in MATLAB function for finding a local minimum. If the objective is convex, there is only one basin, so the local minimum is also the global minimum. However, in my experiment, starting from different initial points, the algorithm returned different minima. I wonder whether fmincon is guaranteed to converge to the global minimum for a convex problem. If not, are there other techniques I can use for convex optimization that are as fast as possible? Thanks.

PS: fmincon uses the interior-point method by default. Is this a known issue with interior-point methods, i.e., that starting from different initial points, the method can reach different minima even for a convex problem?

EDIT:

The objective is to minimize the total energy consumed by a group of users in a communication process, where the bandwidth allocation is the optimization variable. The transmission rate is

$r_k = x_k \log_2\left(1 + \frac{g_k p_k}{x_k}\right)$

The optimization problem is as follows:

$\min_{x} \sum_k \frac{p_k b_k}{r_k}$
s.t. $\sum_k x_k \leq X_{max}$

The objective and constraints are all convex, thus this should be a convex optimization problem.
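
A sketch of the convexity argument for the objective (assuming $x_k > 0$): $r_k(x_k) = x_k \log_2\left(1 + \frac{g_k p_k}{x_k}\right)$ is the perspective of the concave function $t \mapsto \log_2(1 + g_k p_k t)$ evaluated at $t = 1$, hence concave in $x_k$; since $t \mapsto 1/t$ is convex and nonincreasing for $t > 0$, each term $\frac{p_k b_k}{r_k}$ is convex by the composition rule.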

The code is as follows:

options = optimoptions('fmincon');
problem.options = options;
problem.solver = 'fmincon';
problem.objective = @(x) langBW(x, in_s, in_e, C1, a, p_ul);
problem.Aineq = ones(1,user_num);   % linear constraint: sum_k x_k <= BW2
problem.bineq = BW2;
problem.nonlcon = @(x) nonlConstr_bw(x,a,p_ul,T1,in_s,in_e);   % matches the 6-argument definition below

problem.x0 = ones(user_num,1);
[b_ul,fval] = fmincon(problem);
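
For reference, to rule out loose tolerances and near-zero $x_k$ as the cause of the start-point dependence, the same problem can be run with an explicit lower bound and tighter tolerances (a sketch; the bound 1e-6 is an arbitrary placeholder):

problem.lb = 1e-6*ones(user_num,1);        % keep x_k strictly positive so log2(1 + a*p/x) stays finite
problem.options = optimoptions('fmincon', ...
    'Algorithm','interior-point', ...
    'OptimalityTolerance',1e-10, ...
    'StepTolerance',1e-12, ...
    'MaxFunctionEvaluations',1e5, ...
    'Display','iter');
[b_ul,fval,exitflag] = fmincon(problem);   % exitflag = 1 means first-order optimality was met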

langBW is the objective function, which is convex in x. Its code is as follows:

function fmin = langBW(x, in_s, in_e, C1, a, p_ul)
% Objective: total energy consumption as a function of the bandwidth allocation x
if size(x,1) < size(x,2)
    x = x';                                 % make x a column vector
end
b_ul = x;
r_ul = b_ul .* log2(1 + a.*p_ul./b_ul);     % per-user transmission rates

fmin = sum((in_s+in_e).*p_ul./r_ul) + sum(C1);

end
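
For reference, fmincon can also be given the analytic gradient instead of using finite differences, which tends to make the iterations more consistent. langBW_grad below is a hypothetical variant; the derivative follows from $r_k = x_k \log_2(1 + a_k p_k / x_k)$ (please double-check the algebra):

function [fmin, grad] = langBW_grad(x, in_s, in_e, C1, a, p_ul)
% Same objective as langBW, but also returns the analytic gradient so that
% fmincon can be called with optimoptions(...,'SpecifyObjectiveGradient',true).
if size(x,1) < size(x,2)
    x = x';
end
c    = a .* p_ul;                               % c_k = a_k * p_k
r_ul = x .* log2(1 + c ./ x);                   % r_k = x_k*log2(1 + c_k/x_k)
fmin = sum((in_s + in_e) .* p_ul ./ r_ul) + sum(C1);
if nargout > 1
    drdx = log2(1 + c ./ x) - c ./ (log(2) .* (x + c));   % dr_k/dx_k
    grad = -(in_s + in_e) .* p_ul .* drdx ./ r_ul.^2;     % df/dx_k
end
end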

nonlConstr_bw implements the nonlinear constraints. It is shown below:

function [c,ceq] = nonlConstr_bw(x,a,p_ul,T1,in_s,in_e)
user_num = size(p_ul,1);
if size(x,1) < size(x,2)
    x = x';                                 % make x a column vector
end
b_ul = x;
r_ul = b_ul .* log2(1 + a.*p_ul./b_ul);     % per-user transmission rates

c1 = max(in_s./r_ul) + in_e./r_ul - T1;     % fmincon requires c <= 0
c = c1;
ceq = zeros(user_num,1);                    % trivially satisfied; no real equality constraints

end
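
A side note: the max() in c1 makes this constraint nondifferentiable, while fmincon's algorithms assume smooth constraints, and that alone can produce start-point-dependent results. The same constraint can be written without max() as one smooth inequality per pair of users; the hypothetical nonlConstr_bw_smooth below sketches this (it relies on implicit expansion, so it needs R2016b or later):

function [c,ceq] = nonlConstr_bw_smooth(x,a,p_ul,T1,in_s,in_e)
% max_j(in_s_j/r_j) + in_e_k/r_k <= T1 for every k  is equivalent to
% in_s_j/r_j + in_e_k/r_k <= T1 for every pair (j,k), which is smooth.
if size(x,1) < size(x,2)
    x = x';
end
r_ul = x .* log2(1 + a.*p_ul./x);
s = in_s ./ r_ul;                   % column vector: in_s_j / r_j
e = in_e ./ r_ul;                   % column vector: in_e_k / r_k
c = reshape(s + e.' - T1, [], 1);   % all pairwise sums, stacked into one constraint vector
ceq = [];
end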

Except for x, all other variables are supplied. The problem is that when I set a different problem.x0, for example problem.x0 = ones(user_num,1); versus problem.x0 = 2*ones(user_num,1);, the solution returned by [b_ul,fval] = fmincon(problem); is different. That is what I am confused about.

fmincon uses the following algorithms:

'interior-point' (default)
'trust-region-reflective'
'sqp' (Sequential Quadratic Programming)
'sqp-legacy'
'active-set'

These methods will converge to a local minimum, but not necessarily a global minimum. Furthermore, minima may not be unique. The only way to guarantee a global minimum is to search the whole solution space.
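
In practice, if you have the Global Optimization Toolbox, a common compromise is to re-run fmincon from many start points and keep the best result. A sketch using MultiStart with your problem data (variable names taken from the question; the lower bound and the number of start points are arbitrary choices):

prob = createOptimProblem('fmincon', ...
    'objective', @(x) langBW(x, in_s, in_e, C1, a, p_ul), ...
    'x0', ones(user_num,1), ...
    'Aineq', ones(1,user_num), 'bineq', BW2, ...
    'lb', 1e-6*ones(user_num,1), ...
    'nonlcon', @(x) nonlConstr_bw(x, a, p_ul, T1, in_s, in_e), ...
    'options', optimoptions('fmincon','Algorithm','interior-point'));
ms = MultiStart;
[b_ul, fval] = run(ms, prob, 20);   % best of 20 random start points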

From your comment, there appears to be only a single minimum? (For example, a shifted parabola?) Then it should converge.

edit--

Even if your objective function appears convex, the constraints can lead to multiple local minima. Sometimes this is called a "loosely" convex problem.
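
To illustrate with a toy example (not your problem): take the convex objective $f(x) = (x-0.3)^2$ with the nonconvex constraint $x^2 \geq 1$. The feasible set splits into $x \leq -1$ and $x \geq 1$, and fmincon lands in a different piece depending on the start point:

f     = @(x) (x - 0.3).^2;                 % convex objective
nonlc = @(x) deal(1 - x.^2, []);           % c(x) = 1 - x^2 <= 0, i.e. x^2 >= 1 (nonconvex)
opts  = optimoptions('fmincon','Display','off');
x1 = fmincon(f,  2, [],[],[],[],[],[], nonlc, opts)   % returns  1, the global minimum
x2 = fmincon(f, -2, [],[],[],[],[],[], nonlc, opts)   % returns -1, a local minimum only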

