
Multiply decision variable with index used in it

I am writing an optimization model in Python that I will solve with Gurobi. However, I have a problem with one constraint: I want to multiply my decision variable by the index of j:
x[i,j] * j, where j is the index used in x[i,j].
i and j both range over lists. The decision variable x[i,j] is binary.

I tried

for i in I:
    m.addConstr(x[i, j] * J.index(j))

but this always uses the last element of the list J. How can I make the constraint use the index of the j that appears in x[i,j]?
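The behaviour described in the question is plain Python scoping, not a Gurobi issue: the loop shown never binds j, so it picks up whatever value j was left with by an earlier `for j in J:` loop, which is always the last element. A minimal sketch with made-up lists standing in for the model's index sets:

```python
I = ["i1", "i2"]   # hypothetical index sets, standing in for the model's lists
J = ["a", "b", "c"]

for j in J:
    pass           # some earlier loop over J

# After the loop ends, j still holds the LAST element of J:
print(j)           # -> "c"
print(J.index(j))  # -> 2, which is why the constraint always used the last index
```

Any use of j outside the loop that is supposed to iterate over J will therefore always refer to that last element.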

If the elements of J are integers or continuous values, you can directly write x[i,j] * j; if not, your code looks almost correct. I think you want to write this (note that m.addConstr expects a comparison such as <=, == or >=, so a right-hand side still has to be added to these expressions):

for j in J:
    for i in I:
        m.addConstr(x[i, j] * J.index(j))

or

for i in I:
    m.addConstrs(x[i, j] * J.index(j) for j in J)
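In either variant, the index lookup can also be done with `enumerate` instead of `J.index(j)`, which avoids a linear search for every element. A pure-Python sketch of the coefficient pairing (the Gurobi model itself and the comparison in the constraint are left out, since the question does not give them):

```python
J = ["a", "b", "c"]  # hypothetical index list with distinct elements

# J.index(j) and enumerate(J) produce the same (index, element) pairing,
# but enumerate walks the list once instead of searching it for every j:
pairs_index = [(J.index(j), j) for j in J]
pairs_enum = list(enumerate(J))
print(pairs_index == pairs_enum)  # -> True

# In the model, each pair (k, j) would supply the coefficient k for x[i, j],
# e.g. (hypothetical right-hand side): m.addConstr(x[i, j] * k <= rhs)
```

Note that the equivalence holds only when the elements of J are distinct, since `J.index` always returns the first match.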

Can you give details?

You can also use range like this:

for i in I:
    m.addConstrs(x[i, J[k]] * k for k in range(len(J)))

or

for k in range(len(J)):
    for i in I:
        m.addConstr(x[i, J[k]] * k)
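If the goal is one constraint per i that weights each x[i, j] by its position, the terms are usually summed before the comparison rather than added one by one. A pure-Python illustration with 0/1 values standing in for the binary variables (in a real model these would be Gurobi variables combined with gp.quicksum):

```python
J = ["a", "b", "c", "d"]            # hypothetical index list
x_row = dict(zip(J, [0, 1, 0, 1]))  # fake binary row x[i, .] for one fixed i

# Position-weighted sum over j, analogous to
# gp.quicksum(x[i, J[k]] * k for k in range(len(J))):
weighted = sum(x_row[J[k]] * k for k in range(len(J)))
print(weighted)  # -> 1*1 + 1*3 = 4
```

The resulting expression would then appear on one side of a comparison inside a single m.addConstr call.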
