Princeton Handbook of Test Problems: Test 9.3.4

This example is from the book *Princeton Handbook of Test Problems in Local and Global Optimization* (Dempe), Chapter 9.3.4, page 223.

Here, only the second level is described.

Model of the problem

First level

\[\begin{aligned}
\min\quad & 2x_1+2x_2-3y_1-3y_2-60\\
\text{s.t.}\quad & x_1 + x_2 + y_1 - 2y_2 - 40 \leq 0,\\
& 0 \leq x_i \leq 50, \quad \forall i \in I,\\
& -10 \leq y_j \leq 20, \quad \forall j \in J,
\end{aligned}\]

Second level

\[\begin{aligned}
\min\quad & (-x_1 + y_1 + 40)^2 + (-x_2 + y_2 + 20)^2\\
\text{s.t.}\quad & -x_i + 2y_i \leq -10, \quad \forall i \in I,\\
& -10 \leq y_j \leq 20, \quad \forall j \in J.
\end{aligned}\]

using BilevelJuMP
using Ipopt

model = BilevelModel(Ipopt.Optimizer; mode = BilevelJuMP.ProductMode(1e-9))
An Abstract JuMP Model
Feasibility problem with:
Variables: 0
Upper Constraints: 0
Lower Constraints: 0
Bilevel Model
Solution method: BilevelJuMP.ProductMode{Float64}(1.0e-9, false, 0, nothing)
Solver name: Ipopt
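For context (this is a sketch of how product-mode reformulations generally work, not necessarily the package's exact internals): `ProductMode` replaces the lower-level problem by its KKT conditions and relaxes each complementarity condition into a bounded product, using the tolerance passed above (\(\epsilon = 10^{-9}\)):

\[\lambda_i \geq 0, \qquad g_i(x, y) \leq 0, \qquad -\lambda_i \, g_i(x, y) \leq \epsilon,\]

where the \(g_i\) are the lower-level constraints and the \(\lambda_i\) their multipliers. A smaller \(\epsilon\) enforces complementarity more tightly, but can make the resulting nonlinear program harder for Ipopt to solve.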

First we need to create all of the variables in the upper and lower problems:

Upper level variables

@variable(Upper(model), x[i = 1:2], start = 0)
2-element Vector{BilevelVariableRef}:
 x[1]
 x[2]

Lower level variables

@variable(Lower(model), y[i = 1:2], start = -10)
2-element Vector{BilevelVariableRef}:
 y[1]
 y[2]

Then we can add the objective and constraints of the upper problem:

Upper level objective function

@objective(Upper(model), Min, 2x[1] + 2x[2] - 3y[1] - 3y[2] - 60)

\[ 2 x_{1} + 2 x_{2} - 3 y_{1} - 3 y_{2} - 60 \]

Upper level constraints

@constraint(Upper(model), x[1] + x[2] + y[1] - 2y[2] - 40 <= 0)
@constraint(Upper(model), [i = 1:2], x[i] >= 0)
@constraint(Upper(model), [i = 1:2], x[i] <= 50)
@constraint(Upper(model), [i = 1:2], y[i] >= -10)
@constraint(Upper(model), [i = 1:2], y[i] <= 20)
2-element Vector{ConstraintRef{BilevelModel, Int64, ScalarShape}}:
 y[1] ≤ 20
 y[2] ≤ 20

Followed by the objective and constraints of the lower problem:

Lower objective function

@objective(Lower(model), Min, (-x[1] + y[1] + 40)^2 + (-x[2] + y[2] + 20)^2)

\[ x_{1}^2 - 2 y_{1}\times x_{1} + y_{1}^2 + x_{2}^2 - 2 y_{2}\times x_{2} + y_{2}^2 - 80 x_{1} + 80 y_{1} - 40 x_{2} + 40 y_{2} + 2000 \]
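As a sanity check on the expansion printed above, we can evaluate both forms of the lower-level objective at an arbitrary point (the point `(1.0, 2.0, 3.0, 4.0)` below is chosen purely for illustration) and confirm they agree:

```julia
# Compact (squared) form of the lower-level objective
f_sq(x1, x2, y1, y2) = (-x1 + y1 + 40)^2 + (-x2 + y2 + 20)^2

# Expanded form, as printed by JuMP above
f_exp(x1, x2, y1, y2) =
    x1^2 - 2y1 * x1 + y1^2 + x2^2 - 2y2 * x2 + y2^2 -
    80x1 + 80y1 - 40x2 + 40y2 + 2000

# Both forms agree at an arbitrary test point
@assert f_sq(1.0, 2.0, 3.0, 4.0) == f_exp(1.0, 2.0, 3.0, 4.0)
```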

Lower constraints

@constraint(Lower(model), [i = 1:2], -x[i] + 2y[i] <= -10)
@constraint(Lower(model), [i = 1:2], y[i] >= -10)
@constraint(Lower(model), [i = 1:2], y[i] <= 20)
2-element Vector{ConstraintRef{BilevelModel, Int64, ScalarShape}}:
 y[1] ≤ 20
 y[2] ≤ 20

Now we can solve the problem and verify the solution against the one reported in the book:

optimize!(model)

primal_status(model)

termination_status(model)

objective_value(model)

value.(x)

value.(y)
2-element Vector{Float64}:
 -10.00000000063434
 -10.000000000265
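We can also confirm by hand that `y = (-10, -10)` is feasible together with, for example, `x = (0, 0)` (an illustrative upper-level point chosen here for the check, not necessarily the solver's output):

```julia
x = [0.0, 0.0]       # illustrative upper-level point (assumption, not solver output)
y = [-10.0, -10.0]   # lower-level values reported above

# Upper-level constraint: x1 + x2 + y1 - 2*y2 - 40 <= 0
@assert x[1] + x[2] + y[1] - 2y[2] - 40 <= 0

# Lower-level constraints: -xi + 2*yi <= -10 and -10 <= yi <= 20
@assert all(-x[i] + 2y[i] <= -10 for i in 1:2)
@assert all(-10 <= y[i] <= 20 for i in 1:2)

# Upper-level objective at this point: 2*0 + 2*0 + 30 + 30 - 60 = 0.0
upper_obj = 2x[1] + 2x[2] - 3y[1] - 3y[2] - 60
```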

As with any optimization problem, a bilevel problem may admit multiple solutions with the same optimal value; depending on the starting point and any randomness in the algorithm, we expect to obtain one of two optimal solutions for this problem.


This page was generated using Literate.jl.