I want to solve a binary linear program in C# using Microsoft Solver Foundation, but I am getting a wrong answer. The objective value should be 41.1, yet I get 213. Five of the variables should be 1 and all the others should be 0, but many variables come out with wrong values.
The sum of each row of the matrix should be <= 1. Those are my constraints, and as you can see from Constraint_arr, I am building the right constraints.
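For reference, the model my code builds is (one binary variable per matrix cell):

    maximize    sum over i, j of  ratio[i] * x[i, j]
    subject to  sum of x[i, j] over the columns j with vars_Matrix[i, j] = 1  <=  1,  for each row i
                x[i, j] in {0, 1}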
Thanks for any help.
Define decision variables:
SolverContext context = SolverContext.GetContext();
Model model = context.CreateModel();

Decision[,] x = new Decision[name_column.Length, 7];
for (int i = 0; i < name_column.Length; i++)
{
    for (int j = 0; j < 7; j++)
    {
        x[i, j] = new Decision(Domain.IntegerRange(0, 1), "x" + i + j);
    }
}
for (int i = 0; i < name_column.Length; i++)
{
    for (int j = 0; j < 7; j++)
    {
        model.AddDecisions(x[i, j]);
    }
}
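As an aside, the two loops above can be merged, and Domain.Boolean states the 0/1 restriction directly; a sketch, using the same model and name_column (the underscore in the name avoids ambiguous labels such as x111):

for (int i = 0; i < name_column.Length; i++)
{
    for (int j = 0; j < 7; j++)
    {
        // Domain.Boolean is equivalent to Domain.IntegerRange(0, 1) here.
        x[i, j] = new Decision(Domain.Boolean, "x" + i + "_" + j);
        model.AddDecisions(x[i, j]);
    }
}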
Create the constraints and add them to the model:
Term[] Constraint_arr = new Term[name_column.Length];
Term tempC;
int jj;
for (int i = 0; i < name_column.Length; i++)
{
    tempC = 0;
    for (jj = 0; jj < 7; jj++)
    {
        if (vars_Matrix[i, jj] == 1)
        {
            tempC += x[i, jj];
        }
    }
    Constraint_arr[i] = tempC;
    model.AddConstraints("constraint" + i, Constraint_arr[i] <= 1);
}
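The same row constraints can also be built with Model.Sum instead of a running tempC term; a sketch, assuming the same x and vars_Matrix (List<Term> needs using System.Collections.Generic):

for (int i = 0; i < name_column.Length; i++)
{
    var row = new List<Term>();
    for (int j = 0; j < 7; j++)
    {
        if (vars_Matrix[i, j] == 1)
        {
            row.Add(x[i, j]);
        }
    }
    // An empty row would only add the trivial constraint 0 <= 1, so skip it.
    if (row.Count > 0)
    {
        model.AddConstraints("constraint" + i, Model.Sum(row.ToArray()) <= 1);
    }
}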
Create the objective function:
Term objective_Func = 0;
Term tempZ;
for (int i = 0; i < name_column.Length; i++)
{
    tempZ = 0;
    for (int j = 0; j < 7; j++)
    {
        tempZ += x[i, j] * ratio[i];
    }
    objective_Func += tempZ;
}
model.AddGoal("Goal", GoalKind.Maximize, objective_Func);
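Since ratio[i] does not depend on j, this goal is sum over i of ratio[i] times the number of chosen cells in row i; the same goal written with Model.Sum, as a sketch:

var terms = new List<Term>();
for (int i = 0; i < name_column.Length; i++)
{
    for (int j = 0; j < 7; j++)
    {
        // Note: every x[i, j] enters the objective, even where vars_Matrix[i, j] == 0.
        terms.Add(ratio[i] * x[i, j]);
    }
}
model.AddGoal("Goal", GoalKind.Maximize, Model.Sum(terms.ToArray()));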
Print the answer:
Solution solution = context.Solve(new SimplexDirective());
Report report = solution.GetReport();
for (int i = 0; i < name_column.Length; i++)
{
    for (int j = 0; j < 7; j++)
    {
        Console.Write(x[i, j]);
    }
    Console.WriteLine();
}
Console.Write("{0}", report);
Console.ReadLine();
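If Console.Write(x[i, j]) prints the Decision object rather than its value, Decision.GetDouble() retrieves the solved value after Solve; a sketch:

for (int i = 0; i < name_column.Length; i++)
{
    for (int j = 0; j < 7; j++)
    {
        Console.Write("{0} ", x[i, j].GetDouble());  // solved 0/1 value
    }
    Console.WriteLine();
}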
The following MiniZinc model arrives at 14 as the maximum value for the objective:
set of int: rows = 1..5;
set of int: cols = 1..7;
array[rows, cols] of 0..1: vars_Matrix =
    [| 0, 0, 1, 0, 1, 1, 1
     | 0, 0, 1, 1, 0, 1, 1
     | 0, 0, 1, 0, 0, 0, 0
     | 0, 0, 1, 1, 0, 1, 1
     | 0, 0, 0, 0, 1, 0, 0 |];
array[cols] of var 0..1: c;
var int: obj;
% constraint
% obj = sum(i in rows)(
% sum(j in cols) (
% c[i] * vars_Matrix[i, j]
% )
% );
constraint
obj = sum([ sum([ c[i] * vars_Matrix[i, j] | j in cols ]) | i in rows ]);
solve maximize(obj);
Output:
c = array1d(1..7, [1, 1, 1, 1, 1, 1, 1]);
obj = 14;
The same result is obtained from the following Z3py model:
from z3 import *

s = Optimize()

Rows = range(5)
Cols = range(7)

vars_Matrix = [[0, 0, 1, 0, 1, 1, 1],
               [0, 0, 1, 1, 0, 1, 1],
               [0, 0, 1, 0, 0, 0, 0],
               [0, 0, 1, 1, 0, 1, 1],
               [0, 0, 0, 0, 1, 0, 0]]

c = [Int("c" + str(i+1)) for i in Rows]
obj = Int("obj")

for i in Rows:
    s.add(c[i] >= 0, c[i] <= 1)

s.add(obj == Sum([Sum([c[i] * vars_Matrix[i][j] for j in Cols]) for i in Rows]))

s.maximize(obj)

if sat == s.check():
    print(s.model())
else:
    print("No solution. Sorry!")