
Ineq and eq constraints with scipy.optimize.minimize()

I am attempting to understand the behavior of constraints in scipy.optimize.minimize:

First, I create 4 assets and 100 scenarios of returns. Ranked by average return, the assets run from best to worst as D > B > A > C.

# imports assumed from context
import numpy as np
import pandas as pd
from scipy import optimize

# seed first
np.random.seed(1)

df_returns = pd.DataFrame(np.random.rand(100, 4) - 0.25, columns=list('ABCD'))
df_returns.head()

    A           B           C           D
0   0.167022    0.470324    -0.249886   0.052333
1   -0.103244   -0.157661   -0.063740   0.095561
2   0.146767    0.288817    0.169195    0.435220
3   -0.045548   0.628117    -0.222612   0.420468
4   0.167305    0.308690    -0.109613   -0.051899

and a set of starting weights:

weights = pd.Series([0.25, 0.25, 0.25, 0.25], index=list('ABCD'))

    0
A   0.25
B   0.25
C   0.25
D   0.25

we create an objective function:

def returns_objective_function(weights, df_returns):
    # negative of the mean portfolio return, so minimizing it maximizes the return
    result = -1. * (df_returns * weights).mean().sum()
    return result
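
As a quick sanity check (a hypothetical one-liner, reusing the weights and df_returns defined above), the objective can be evaluated directly; it returns the negated mean portfolio return, so smaller is better:

# negated mean portfolio return at the equal-weight starting point
returns_objective_function(weights, df_returns)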

Then we define constraints and bounds:

cons = ({'type': 'eq', 'fun': lambda weights: np.sum(weights) - 1})
bnds = ((0.01, 0.8), (0.01, 0.8), (0.01, 0.8), (0.01, 0.75))

Let's optimize

optimize.minimize(returns_objective_function, weights, args=(df_returns,),
                  bounds=bnds, constraints=cons, method='SLSQP')

And we get success:
  status: 0
 success: True
    njev: 8
    nfev: 48
     fun: -0.2885398923185326
       x: array([ 0.01,  0.23,  0.01,  0.75])
 message: 'Optimization terminated successfully.'
     jac: array([-0.24384782, -0.2789166 , -0.21977262, -0.29300382,  0.        ])
     nit: 8
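
For readability, the solution can be mapped back to the asset labels (a hypothetical follow-up, assuming the call above was stored as res):

# res = optimize.minimize(...) as above
pd.Series(res.x, index=list('ABCD'))
# A    0.01
# B    0.23
# C    0.01
# D    0.75   -> most weight goes to D, the best asset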

Now I wish to add constraints starting with a basic inequality:

The scipy.optimize.minimize documentation states:

Equality constraint means that the constraint function result is to be zero whereas inequality means that it is to be non-negative.

cons = (
    {'type': 'eq', 'fun': lambda weights: np.sum(weights) - 1},
    {'type': 'ineq', 'fun': lambda weights: np.sum(weights) + x},
)

Depending on x, I get unexpected behavior.

x = -100

Based on the bounds, the weights can sum to at most 3.15, and the first equality constraint np.sum(weights) - 1 forces them to sum to exactly 1; either way, np.sum(weights) + x is always negative. I believe no feasible solution should be found, yet scipy.optimize.minimize returns success.
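
One way to confirm this by hand (a sketch, reusing the objects defined above) is to evaluate the constraint functions at the solution the solver reports:

x = -100
res = optimize.minimize(returns_objective_function, weights, args=(df_returns,),
                        bounds=bnds, constraints=cons, method='SLSQP')

print(res.success)            # True in the run described above, despite the infeasibility
print(cons[0]['fun'](res.x))  # equality constraint: ~0, satisfied
print(cons[1]['fun'](res.x))  # inequality constraint: ~1 + x = -99, should be non-negative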

With a simpler model I get the same behavior:

x = [1, 2]
optimize.minimize(
    lambda x: x[0]**2 + x[1]**2,
    x,
    constraints=(
        {'type': 'eq', 'fun': lambda x: x[0] + x[1] - 1},
        {'type': 'ineq', 'fun': lambda x: x[0] - 2},
    ),
    bounds=((0, None), (0, None)),
    method='SLSQP')

with results:

   nfev: 8
    fun: 2.77777777777712
    nit: 6
    jac: array([  3.33333334e+00,   2.98023224e-08,   0.00000000e+00])
      x: array([  1.66666667e+00,   1.39888101e-14])
success: True
message: 'Optimization terminated successfully.'
 status: 0
   njev: 2

There should be some flag indicating that this is an infeasible solution.
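
Checking the reported minimizer against the constraints by hand (plain arithmetic on the output above) shows that both are violated:

x_opt = [1.66666667, 1.39888101e-14]   # reported solution from the output above
print(x_opt[0] + x_opt[1] - 1)         # equality constraint: ~0.667, not 0
print(x_opt[0] - 2)                    # inequality constraint: ~-0.333, negative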

SLSQP is also available in R (here via the nloptr package):

> slsqp(c(1,2),
+       function(x) {x[1]^2+x[2]^2},
+       heq=function(x){x[1]+x[2]-1},
+       hin=function(x){x[1]-2},
+       lower=c(0,0))
$par
[1] 1.666667e+00 4.773719e-11

$value
[1] 2.777778

$iter
[1] 105

$convergence
[1] -4

$message
[1] "NLOPT_ROUNDOFF_LIMITED: Roundoff errors led to a breakdown of the optimization algorithm. In this case, the returned minimum may still be useful. (e.g. this error occurs in NEWUOA if one tries to achieve a tolerance too close to machine precision.)"

At least we see some warning signals here.
