
scipy.optimize.minimize fails to converge for matrix input with constraints

(This is my first question; I'll edit it if something is lacking. I did research prior to posting.)

I want to predict x*C = y (x and y are datasets, C is a matrix), with the constraints that the rows of C sum to 1 and that its elements are between 0 and 1.

Because it's the rows that are constrained, not the columns, I can't just use ordinary linear regression and have to write the error function myself. I did this successfully in Matlab, so I know the issue isn't in the data or the method; it's probably in my code.

My code (below) gives one of these two errors (depending on the random initial guess, I assume):

More than 3*n iterations in LSQ subproblem    (Exit mode 3)
Inequality constraints incompatible    (Exit mode 4)

Any help would be greatly appreciated. I'm new to Python and spent a lot of time on this.

import numpy as np
from scipy.optimize import minimize

M1 = data_2013.shape[1]
M2 = data_2015.shape[1]

def error_function(C):
    # minimize passes C as a flat vector, so reshape it back into a matrix
    C = C.reshape(M1, M2)
    return np.sum((np.dot(data_2013, C) - data_2015) ** 2)

def between_zero_and_one(x):
    # >= 0 exactly when every element lies in [0, 1]
    return x * (1 - x)

def eq_constraint(x):
    # each row of C must sum to 1
    x = x.reshape(M1, M2)
    return x.sum(axis=1) - 1

cons = [{'type': 'ineq', 'fun': between_zero_and_one},
        {'type': 'eq', 'fun': eq_constraint}]

C0 = np.random.rand(M1, M2).ravel()   # minimize expects a 1-D initial guess
result = minimize(error_function, C0, constraints=cons,
                  options={'disp': True, 'maxiter': 10000})
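For reference, this is how I look at what comes back (just a sketch; result is the OptimizeResult object that minimize returns, nothing here is specific to my data):

print(result.success, result.message)   # prints e.g. "Inequality constraints incompatible"
C_fit = result.x.reshape(M1, M2)        # flat solution vector back into matrix form
print(C_fit.sum(axis=1))                # row sums should all be 1 if the eq constraint held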

Sascha's answer helped me - the problem converged well with cvxpy.

Code:

import cvxpy as cvx   # note: this uses the pre-1.0 cvxpy interface (Variable(m, n), sum_entries)

M1 = x_data.shape[1]
M2 = y_data.shape[1]

C = cvx.Variable(M1, M2)                          # the matrix being solved for
constraints = [0 <= C, C <= 1,                    # elementwise bounds
               cvx.sum_entries(C, axis=1) == 1]   # each row sums to 1
objective = cvx.Minimize(cvx.norm((x_data.values * C) - y_data.values))
prob = cvx.Problem(objective, constraints)
prob.solve()
C_mat = C.value
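As a quick sanity check (just a sketch; it only assumes C_mat can be read as a NumPy array), the constraints can be verified on the recovered matrix:

import numpy as np

C_arr = np.asarray(C_mat)                             # C.value may come back as a matrix type
print(np.allclose(C_arr.sum(axis=1), 1))              # rows sum to 1 (up to solver tolerance)
print(C_arr.min() >= -1e-6, C_arr.max() <= 1 + 1e-6)  # entries stay within [0, 1]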

Thanks, Sascha!
