
How to find the bounds of the possible values of a variable in linear programming with Python?

I have a linear problem defined by some variables and linear constraints, and I would like to know the interval of possible values of each variable.

For example, with variables a, b, and c, and constraints a > b, b > c, and a + b + c = 100, we have:

  • a in [33.33, 100]
  • b in [0, 50]
  • c in [0, 33.33]

For now, my solution is to use PuLP's linear programming solver: for each variable, I set it as the objective to maximize (to get its upper bound) and then to minimize (to get its lower bound).

This means I have to run the solver twice per variable, which is probably not optimal.
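A minimal sketch of this minimize/maximize loop with PuLP looks like this (assuming non-negative variables, and >= in place of the strict inequalities, since strict inequalities cannot be expressed in an LP):

```python
import pulp

# Toy model from above: a >= b, b >= c, a + b + c == 100, all variables >= 0.
a = pulp.LpVariable("a", lowBound=0)
b = pulp.LpVariable("b", lowBound=0)
c = pulp.LpVariable("c", lowBound=0)

intervals = {}
for var in (a, b, c):
    interval = []
    for sense in (pulp.LpMinimize, pulp.LpMaximize):
        prob = pulp.LpProblem(f"bound_{var.name}", sense)
        prob += var                      # objective: the variable itself
        prob += a >= b
        prob += b >= c
        prob += a + b + c == 100
        prob.solve(pulp.PULP_CBC_CMD(msg=False))
        interval.append(var.varValue)
    intervals[var.name] = tuple(interval)

print(intervals)  # roughly {'a': (33.33, 100.0), 'b': (0.0, 50.0), 'c': (0.0, 33.33)}
```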

Does anyone know of a tool dedicated to finding the intervals of possible values of the variables of a linear program?

This is a very common topic, usually called bounds tightening, and it is very important in:

  • static pre-solving
  • iterative use in global-optimization

The algorithm you described is usually called optimization-based bounds tightening (OBBT), and it's not that bad. The problem in your case is that PuLP does not let you work at a lower level and use warm starts, so you cannot avoid the full number of iterations in every run.

Gleixner, Ambros M., et al. "Three enhancements for optimization-based bound tightening." Journal of Global Optimization 67.4 (2017): 731-757, for example, starts with:

Optimization-based bound tightening (OBBT) is one of the most effective procedures to reduce variable domains of nonconvex mixed-integer nonlinear programs (MINLPs). At the same time it is one of the most expensive bound tightening procedures, since it solves auxiliary linear programs (LPs)—up to twice the number of variables many. The main goal of this paper is to discuss algorithmic techniques for an efficient implementation of OBBT.
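As an illustration of what working at a lower level could buy: with a modelling layer that supports parameters and warm starts (for example CVXPY, which is not mentioned above and is only an assumption here), the OBBT loop of up to 2·n LP solves can reuse a single model and only swap the objective vector between solves. Whether the warm start is actually exploited depends on the underlying solver. A rough sketch:

```python
import cvxpy as cp
import numpy as np

# Same toy model: x = [a, b, c], a >= b >= c >= 0, a + b + c == 100.
x = cp.Variable(3, nonneg=True)
constraints = [x[0] >= x[1], x[1] >= x[2], cp.sum(x) == 100]

# The objective vector is a Parameter, so the problem is built once and only
# the parameter value changes between the 2*n solves.
obj = cp.Parameter(3)
prob = cp.Problem(cp.Minimize(obj @ x), constraints)

intervals = {}
for i, name in enumerate("abc"):
    e = np.zeros(3)
    e[i] = 1.0
    obj.value = e                   # minimize x_i  -> lower bound
    lo = prob.solve(warm_start=True)
    obj.value = -e                  # minimize -x_i -> upper bound (negated)
    hi = -prob.solve(warm_start=True)
    intervals[name] = (lo, hi)

print(intervals)  # roughly {'a': (33.33, 100.0), 'b': (0.0, 50.0), 'c': (0.0, 33.33)}
```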

There are also alternatives that do not rely on LP technology (e.g. interval arithmetic), such as feasibility-based bounds tightening; a small sketch of this kind of propagation follows the reference below.

See for example:

  • Belotti, Pietro, et al. "Feasibility-based bounds tightening via fixed points." International Conference on Combinatorial Optimization and Applications. Springer, Berlin, Heidelberg, 2010.
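To make the idea concrete, here is a rough sketch of interval propagation (FBBT) over linear constraints of the form lhs <= sum_i a_i * x_i <= rhs; the constraint and bounds data structures are made up for this illustration. Note that on the toy model above this propagation reaches its fixed point immediately and does not tighten the initial [0, 100] boxes at all, which illustrates why FBBT is cheaper but generally weaker than OBBT:

```python
import math

def fbbt(constraints, bounds, tol=1e-9, max_passes=100):
    """Feasibility-based bounds tightening for linear constraints.

    constraints: list of (coefs, lhs, rhs), where coefs is a dict
                 {var_name: nonzero coefficient}, meaning
                 lhs <= sum_i coefs[i] * x_i <= rhs.
    bounds:      dict {var_name: [lo, hi]}, tightened in place until a fixed point.
    """
    for _ in range(max_passes):
        improved = False
        for coefs, lhs, rhs in constraints:
            # Interval of each term a_i * x_i under the current bounds.
            term_lo = {v: min(a * bounds[v][0], a * bounds[v][1]) for v, a in coefs.items()}
            term_hi = {v: max(a * bounds[v][0], a * bounds[v][1]) for v, a in coefs.items()}
            for v, a in coefs.items():
                # Activity interval of the remaining terms.
                rest_lo = sum(term_lo[w] for w in coefs if w != v)
                rest_hi = sum(term_hi[w] for w in coefs if w != v)
                # a * x_v must lie in [lhs - rest_hi, rhs - rest_lo].
                new_lo, new_hi = sorted(((lhs - rest_hi) / a, (rhs - rest_lo) / a))
                if new_lo > bounds[v][0] + tol:
                    bounds[v][0] = new_lo
                    improved = True
                if new_hi < bounds[v][1] - tol:
                    bounds[v][1] = new_hi
                    improved = True
        if not improved:        # fixed point reached
            break
    return bounds

# Toy model: a >= b, b >= c, a + b + c == 100, all variables in [0, 100].
inf = math.inf
constraints = [
    ({"a": 1, "b": -1}, 0, inf),           # a - b >= 0
    ({"b": 1, "c": -1}, 0, inf),           # b - c >= 0
    ({"a": 1, "b": 1, "c": 1}, 100, 100),  # a + b + c == 100
]
bounds = {"a": [0.0, 100.0], "b": [0.0, 100.0], "c": [0.0, 100.0]}
print(fbbt(constraints, bounds))  # no tightening on this particular model
```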

You now know some keywords to search for. Maybe the following is a good starting point (I have not read it, though):

  • Puranik, Yash, and Nikolaos V. Sahinidis. "Domain reduction techniques for global NLP and MINLP optimization." Constraints 22.3 (2017): 338-376.
