Presentation at the INFORMS Practice Conference, 2009.04.26–28. Discusses how computational methods, especially evolutionary computation (and genetic algorithms), can be used to find multiple "solutions of interest" to an optimization problem, and how these solutions can be useful in decision making.
Model-Based Decision Making
On Supporting Deliberation with Metaheuristics
Steven O. Kimbrough
University of Pennsylvania
INFORMS Practice Conference, 2009.04.26–28
Acknowledgements
Collaborators: CHENG Shih-Fen, Gary Koehler, Ann Kuo,
Lindawati, LAU Hoong Chuin (HC), Ming Lu, David Harlan
Wood, D.J. Wu
Outline
1 Introduction
Deliberation
Simple Example: Simple Knapsack
2 Further Example
GAP: The Generalized Assignment Problem
3 Extended Example
GAP4_2 FoIs
GAP4_2 IoIs
4 Evolutionary Computation
Motivation
Basics
Hard Problems
5 Discussion
Key Points
Vision
Deliberation
Given an optimization problem. . .
How can we recognize a modeling opportunity?
How should we build the model(s)?
How can we get a good solution?
What should our decision be?
Our focus: this last question. The deliberation problem.
Recognize: The questions interact.
Knapsack Problems
Canonical form for the Simple Knapsack model
\[
\max z = \sum_{i=0}^{n} p_i x_i \tag{1}
\]
subject to the constraints
\[
\sum_{i=0}^{n} w_i x_i \le c \tag{2}
\]
\[
x_i \in \{0, 1\}, \quad i = 0, 1, 2, \ldots, n. \tag{3}
\]
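For small instances this model can be solved exactly. A minimal dynamic-programming sketch in Python, with made-up data and assuming integer weights:

```python
def solve_knapsack(profits, weights, c):
    """0-1 knapsack by dynamic programming over capacities 0..c.
    Returns (z*, chosen item indices). Assumes integer weights."""
    n = len(profits)
    # best[k] = best profit achievable with capacity k; choice supports traceback
    best = [0] * (c + 1)
    choice = [[False] * (c + 1) for _ in range(n)]
    for i in range(n):
        for k in range(c, weights[i] - 1, -1):  # descending: each item used once
            cand = best[k - weights[i]] + profits[i]
            if cand > best[k]:
                best[k] = cand
                choice[i][k] = True
    # trace back the selected items
    items, k = [], c
    for i in range(n - 1, -1, -1):
        if choice[i][k]:
            items.append(i)
            k -= weights[i]
    return best[c], sorted(items)

# hypothetical 4-item instance with capacity 6
z, items = solve_knapsack([10, 13, 7, 8], [3, 4, 3, 2], c=6)
```

On this toy instance the optimum is z = 21, selecting items 1 and 3.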
Canonical GAP (= or ≤ on (6))
\[
\max z = \sum_{i=1}^{m} \sum_{j=1}^{n} c_{ij} x_{ij} \tag{4}
\]
Subject to the constraints:
\[
\sum_{j=1}^{n} r_{ij} x_{ij} \le b_i, \quad i = 1, \ldots, m \tag{5}
\]
\[
\sum_{i=1}^{m} x_{ij} \le 1, \quad j = 1, \ldots, n \tag{6}
\]
\[
x_{ij} \in \{0, 1\} \quad \forall i, j \tag{7}
\]
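To make the formulation concrete, here is a sketch of how a candidate GAP solution can be evaluated: objective value plus the capacity slacks used throughout these slides. The instance data are made up, not from any of the test problems:

```python
def evaluate_gap(assign, c, r, b):
    """Evaluate a GAP solution given as assign[j] = machine i for job j.
    c[i][j]: profit of job j on machine i; r[i][j]: resource use; b[i]: capacity.
    Returns (z, slacks); a negative slack marks a violated capacity constraint."""
    m = len(b)
    z = sum(c[i][j] for j, i in enumerate(assign))
    used = [0] * m
    for j, i in enumerate(assign):
        used[i] += r[i][j]
    slacks = [b[i] - used[i] for i in range(m)]
    return z, slacks

# tiny hypothetical instance: 2 machines, 3 jobs
c = [[5, 4, 3], [4, 6, 5]]
r = [[2, 3, 2], [3, 2, 2]]
b = [4, 4]
z, slacks = evaluate_gap([0, 1, 1], c, r, b)  # jobs: 0 -> m0; 1, 2 -> m1
```

Here z = 16 with slacks [2, 0]; the assignment encoding satisfies constraint (6) by construction.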
Deliberation
Given a good solution, provide information relevant to
implementing the decision and to reconsidering the model.
Think: post-solution analysis, sensitivity analysis,
post-evaluation analysis, etc.
The candle-lighting principle: “It is better to light one
candle than to curse the darkness.”
The best response to a tight constraint may well be to
loosen it.
Generalize the point: other aspects of the model.
Uncertainty, action.
Shadow prices and reduced costs
From linear programming. Address the candle-lighting
principle.
How much would it be worth to loosen these constraints?
How much would it cost us to tighten these constraints?
And similar questions.
All in support of deliberation.
How can these questions be addressed outside of linear
programming models?
Simple Example: Simple Knapsack
z to c Response Curve (knapsack101)
[Figure: z to c response curve for the knapsack101 problem. x-axis: c, right-hand-side values (160 to 240); y-axis: z, objective function values (1000 to 1250).]
z to c Response Curve: Zooming in
[Figure: the same response curve zoomed in. x-axis: c, right-hand-side values (195 to 205); y-axis: z, objective function values (1100 to 1135).]
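For toy instances, a response curve like the one above can be generated by brute-force re-solving at each capacity. A sketch with hypothetical data, not the knapsack101 instance:

```python
from itertools import combinations

def best_z(profits, weights, c):
    """Brute-force optimum of a tiny 0-1 knapsack at capacity c."""
    n = len(profits)
    best = 0
    for k in range(n + 1):
        for S in combinations(range(n), k):
            if sum(weights[i] for i in S) <= c:
                best = max(best, sum(profits[i] for i in S))
    return best

# hypothetical 5-item instance; sweep c to trace z as a function of c
profits = [10, 13, 7, 8, 4]
weights = [3, 4, 3, 2, 1]
curve = {c: best_z(profits, weights, c) for c in range(4, 11)}
```

The resulting curve is a nondecreasing step function of c, as in the figures.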
Two Challenges
We are often interested in z as a function of many varying
parameters, not just one (here, c).
A response surface, not a response line.
The resulting search space will often be too large to
support re-solving the model comprehensively.
The original problem may be expensive to solve, typically
requiring a metaheuristic, so frequent re-solving is not
possible, even in a relatively small search space.
GAP: The Generalized Assignment Problem
Illustrative Example: GAP5_1
An organization is presented with a resource allocation
problem for which it has formulated a 0-1 integer program,
called GAP5_1.
The problem has 24 decision variables, each taking a value in
{1, 2, 3, 4, 5, 6, 7, 8}, and 8 capacity constraints, for a search
space of size 8^24 = 2^72 = 4,722,366,482,869,645,213,696 ≈ 4.7 × 10^21.
A standard branch-and-bound solver (CPLEX) has found
the optimal solution, x∗, for which z∗ = 563.
Note: OR-Library, J.E. Beasley,
http://people.brunel.ac.uk/~mastjjb/jeb/info.html [1].
Solving with CPLEX
CPLEX> read
Name of file to read: GAP5_1.lp
Problem ’GAP5_1.lp’ read.
Read time = 0.00 sec.
CPLEX> mipopt
...
Solution pool: 2 solutions saved.
MIP - Integer optimal solution:
Objective = 5.6300000000e+02
Solution time=0.03 sec. Iterations = 0
Nodes = 0
CPLEX>
Should we implement x∗?
If this were a linear programming problem. . .
We would look at shadow prices, reduced costs, &c.
Are there constraints that could be tightened at little cost
and the resources freed up deployed profitably elsewhere?
Are there constraints that could be relaxed for a cost that
would be covered by the benefit of a better solution?
Perhaps we are sellers in a combinatorial auction. We want
to consider acquiring or releasing resources in the future,
or negotiating with certain bidders.
And so on. We would engage in deliberation (aka:
post-solution analysis, sensitivity analysis, post-evaluation
analysis).
But this isn’t LP, so what can we do? How can we get
support for this sort of deliberation?
Sensitivity and Post-Solution Analysis
CPLEX> display sensitivity
Not available for mixed integer problems.
Use CHANGE PROBLEM to change problem type.
CPLEX>
What to do?
Recall: Solution pool: 2 solutions saved.
CPLEX> display solution list *
Solution Name   Objective Value   Change from Incumbent
p1                     555.0000                     8.33%
p2                     563.0000                     0.00%
CPLEX>
CPLEX> display solution member 1 slacks *
Solution p1
Constraint Name Slack Value
slack c1 1.000000
slack c3 4.000000
slack c4 7.000000
slack c7 2.000000
slack c8 2.000000
All other slacks matching ’*’ are 0.
CPLEX> display solution member 2 slacks *
Solution p2
Constraint Name Slack Value
slack c1 2.000000
slack c2 9.000000
slack c3 7.000000
slack c4 6.000000
slack c7 3.000000
slack c8 3.000000
All other slacks matching ’*’ are 0.
CPLEX>
Illustrative Example: GAP5_1 CPLEX Solution Pool Displayed in
Resource Usage Mode

Capacity Constraints
Constraint:    C1  C2  C3  C4  C5  C6  C7  C8  Slacks
Capacity:      36  35  38  34  32  34  31  34     —
Sol'n.  Obj.
2 (x∗)  563     2   9   7   6   0   0   3   3     30
1       555     1   0   4   7   0   0   2   2     16
At optimality, there is a slack of 2 on constraint 1, 9 on
constraint 2, etc.
Potentially useful information for deliberation, especially when
combined with the corresponding solutions.
But. . . it would be nice to have more
Are there other interesting feasible solutions?
If so, how might we get them?
What about interesting infeasible solutions?
How might we get them?
Obvious approach: change the constraints and
re-solve the problem
In this simple example, the maximal slack value
encountered was 9. If we sweep out the range of
tightening from 1–9 on each constraint, there are
9^8 = 43,046,721 possibilities.
At 100 solutions per second, it would take
43,046,721 / (60 × 60 × 24 × 100) ≈ 5 days to make the
calculations.
And what about infeasible solutions? If we also sweep these to
a depth of 9, we get 18^8 / 8,640,000 ≈ 1,275.45 days.
(Efficiencies are possible, and these are upper bounds, but the
basic point remains.)
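The arithmetic behind these estimates can be checked directly:

```python
# Re-solve cost estimates from the slide, computed explicitly.
tighten = 9 ** 8                                 # tightening 1..9 on 8 constraints
days_tighten = tighten / (100 * 60 * 60 * 24)    # at 100 solves per second
both = 18 ** 8                                   # tightening and relaxing, depth 9 each way
days_both = both / (100 * 60 * 60 * 24)
```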
Illustrative Example: GAP5 Optimal Solution Displayed
in Resource Usage Mode

Capacity Constraints
Constraint:    C1  C2  C3  C4  C5  C6  C7  C8  Slacks
Capacity:      36  35  38  34  32  34  31  34     —
Sol'n.  Obj.
x∗      563     2   0   2   5   4   1   3   2     19
At optimality, there is a slack of 2 on constraint 1, 0 on
constraint 2, etc.
There are (at least) two dozen alternative feasible solutions
which are close in value to the optimal solution (559/563 =
0.993) and that offer seemingly large freeing of resources.
Depending on prices (not represented in the model), some
of these alternatives may be very attractive.
We found about 2000 distinct feasible solutions with
objective function values ≥ 550. These are data for
deliberation. Note that EC also found an alternate optimal
solution.
All this with modest computational effort.
An Observation
Varying the capacities on the constraints, relaxing them
and tightening them, creates a new search space. How
large is that space?
The largest differential on the tightening side is 20. Going
with just 13 on the relaxing side, 20 + 13 = 33 values per
constraint. With 8 constraints (and limiting ourselves to
integers) this is 33^8 = 1,406,408,618,241 ≈ 1.4 × 10^12.
Too many times to re-solve the problem. (Again, efficiencies
are possible and these are upper bounds, but the basic point
remains.)
We need another method, even for such small problems.
How can we find interesting, non-optimal solutions?
GAP4_2 FoIs
GAP4_2
A representative GAP test problem.
OR-Library, J.E. Beasley,
http://people.brunel.ac.uk/~mastjjb/jeb/info.html [1].
Instance c530-2: 5 machines, 30 jobs; objective value 644 at
the optimum.
Solved with FI-2Pop GA.
FoIs: top 1000 feasible solutions, ranked by objective
value.
IoIs: top 1000 infeasible solutions, ranked by distance to
feasibility.
Now let’s compare them on their constraint slacks
Comparison of Slacks

Solution   1  2  3  4  5
A = 644    2  5  0  1  1
B = 644    2  1  1  2  0
Decision makers may well prefer A over B or vice versa.
Shadow-price-like questions
Obj. Val. 1 2 3 4 5
A=644 2 5 0 1 1
B=644 2 1 1 2 0
643 2 5 1 4 1
643 2 1 1 2 0
643 0 4 1 11 0
643 5 4 1 3 0
643 2 1 1 3 0
642 4 0 1 0 1
642 2 1 0 1 0
642 2 5 4 1 1
642 5 4 3 1 0
642 0 4 3 9 0
641 2 1 1 4 0
641 2 5 0 3 1
Can the additional slack resources be usefully deployed?
On which resources do we not have much opportunity for
redeployment? (Answer: constraints 2 and 5.)
Shadow-price-like questions
In both of my optimal solutions, constraint 1 has a slack of
2. Are there any good solutions available with a slack of at
least 10?
Yes. #23 has a slack on constraint 1 of 10 and an objective
value of 640; #53, 11, 638; #97, 13, 636.
What about constraint 2, which already has a slack of 5?
#18, 641, 8; #68, 637, 15; #74, 637, 18.
3? At 0 or 1. #10, 642, 4; #39, 639, 8; #48, 638, 10.
4? At 1 or 2. #5, 643, 11.
5? At 0 or 1. #35, 639, 9.
Plus, we can do combinations. . .
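Queries of this kind are simple filters over a stored pool of FoIs. A sketch in Python; the objective/slack pairs below are illustrative, not the actual GAP4_2 pool:

```python
# Hypothetical FoI pool: (objective value, slack per constraint).
fois = [
    (644, [2, 5, 0, 1, 1]),
    (643, [2, 5, 1, 4, 1]),
    (642, [4, 0, 1, 0, 1]),
    (640, [10, 3, 0, 2, 0]),
    (638, [11, 2, 1, 0, 0]),
]

def query(pool, constraint, min_slack):
    """Good solutions with at least min_slack on one constraint (0-indexed)."""
    return [(z, s) for z, s in pool if s[constraint] >= min_slack]

# e.g., good solutions with slack of at least 10 on constraint 1
hits = query(fois, constraint=0, min_slack=10)
```

Combination queries are conjunctions of such filters over the same pool.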
Slacks for top 100 feasible solutions
[Figure: 3-D bar chart of slack values (0 to 20) on constraints 1
through 5 for the top 100 feasible solutions.]
Reduced-cost-like questions
Job 25 is assigned to machine 2 in solution A and machine
3 in solution B. What’s the best solution if we assign job 25
to machine 1?
Answer: It has an objective value of 643. The solution is
Job 25 to machine 1 (rows give the tens digit of the job
number, columns the units digit; entries are the assigned
machines):

      0  1  2  3  4  5  6  7  8  9
0     –  3  3  5  1  2  3  4  1  4
1     2  4  2  1  4  5  5  2  2  5
2     3  4  3  3  5  1  1  4  1  5
3     2  –  –  –  –  –  –  –  –  –

Its slacks:
Obj. Val.  1  2  3  4  5
643        2  1  1  3  0
GAP4_2 IoIs
Shadow-price-like questions
We’re at 644, but we need to get to 651. Show me what I
need to do.
In the top 1000 IoIs there are 75 discovered solutions with
objective values at or above 651.
Reduced-cost-like questions
Job 20 is assigned to machine 3 in solutions A and B, and
to 3 (and sometimes 4) in all the FoIs.
There are 57 solutions in the IoIs in which we assign job 20
to machine 2.
What are the most promising of these solutions?
Reduced-cost-like questions. 20 to 2?
Obj. 1 2 3 4 5
647 2 -2 0 6 -5
646 2 -2 -12 -8 1
646 2 -2 1 9 -5
645 0 -6 0 -5 -12
645 -8 -8 1 2 8
645 2 -2 -4 -9 9
645 4 -7 1 5 -5
645 4 -7 -1 6 -3
645 -5 -3 1 1 8
644 -8 -12 0 -1 -1
644 2 1 -4 2 -13
644 13 -2 0 1 -3
644 2 4 0 1 -12
644 2 -2 1 11 -2
644 0 1 -7 7 -5
644 0 1 1 9 -5
644 -2 1 7 0 -4
Is there opportunity to acquire more resources? To trade in
slacks and surpluses?
Note: We do not find an opportunity with 20 assigned to 2
and z > 647.
Motivation
How can we, how did we, find the FoIs and the IoIs?
Through evolutionary computation.
Observation: Population-based heuristics profusely and
intelligently sample the solution space in the vicinity of
good solutions.
Idea: Use these samples to derive the FoIs and IoIs.
Basics
Evolutionary Algorithm Schema
1 Initialize parameters.
2 Create P0, the initial population of solutions.
3 P ← P0
4 done ← false
5 Do while not done:
  1 Evaluate and record the fitness of each member of P.
  2 Select members of P to compose the next generation, P′.
  3 Perturb the members of P′.
  4 P ← P′
  5 If the stopping condition is met, done ← true
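The schema above can be sketched as a minimal bit-string GA in Python, with tournament selection plus mutation and single-point crossover as the perturbation step; the OneMax sanity check (maximize the number of 1s) is a made-up illustration:

```python
import random

def run_ga(fitness, n_bits, pop_size=50, generations=100,
           p_mut=0.01, p_cross=0.7, tournament_k=2):
    """Minimal GA over bit strings following the schema:
    evaluate, select (tournament), perturb (crossover then mutation)."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        # selection: tournament on fitness
        nxt = []
        for _ in range(pop_size):
            contenders = random.sample(pop, tournament_k)
            nxt.append(max(contenders, key=fitness)[:])
        # perturbation: single-point crossover on consecutive pairs, then mutation
        for a in range(0, pop_size - 1, 2):
            if random.random() < p_cross:
                pt = random.randrange(1, n_bits)
                nxt[a][pt:], nxt[a + 1][pt:] = nxt[a + 1][pt:], nxt[a][pt:]
        for ind in nxt:
            for i in range(n_bits):
                if random.random() < p_mut:
                    ind[i] = 1 - ind[i]
        pop = nxt
        best = max(pop + [best], key=fitness)  # track the best solution seen
    return best

# sanity check on OneMax: maximize the number of 1s
random.seed(0)
champ = run_ga(fitness=sum, n_bits=20)
```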
Comments
There are many, many variations on this schema. I’ll focus
prototypically on genetic algorithms, GAs.
A solution is typically a string of numbers, serving as
arguments to an evaluation function.
For example, in a simple knapsack problem, solutions
might be represented as lists of 0s and 1s, e.g.
0 0 1 0 1 1
for a 6-item knapsack problem. Note: many other
encodings and representations are possible, e.g., to
include representations of parameters.
Comments (con’t.)
Typically, the initial population in a GA consists of
randomly-generated solutions.
Useful: seed the initial population with promising solutions
obtained otherwise.
Fitness evaluation is performed on each solution. This is
typically the most expensive operation and is recognized
as the standard point of comparison for different methods.
Selection can be done in many different ways, but it is
always done on the basis of fitness.
We use tournament selection . . .
Comments (con’t.)
Perturbation of the next generation population members
can be done in many, many different ways.
We are using just mutation and recombination by
single-point crossover. Prototypical for a simple GA.
Mutation: randomly, with some probability, change the
value of a locus, e.g.,
0 0 1 0 1 1 −→ 0 1 1 0 1 1
Comments (con’t.)
Recombination by single-point crossover: randomly, with
some probability, exchange information between two
parents/members of the next generation, e.g. (cut after the
fourth locus),
0 0 1 0 1 1 −→ 0 0 1 0 0 0
0 0 1 1 0 0 −→ 0 0 1 1 1 1
Analogous to sexual recombination.
Stopping condition.
Take your pick. We fix the number of generations (new
populations created) in order to fix the number of fitness
evaluations, for comparing different GAs.
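Both operators are a few lines each; the crossover call below reproduces the slide's example, with the cut after the fourth locus:

```python
import random

def mutate(bits, p=0.05):
    """Flip each locus independently with probability p."""
    return [1 - b if random.random() < p else b for b in bits]

def crossover(mom, dad, point=None):
    """Single-point crossover: swap the tails after a cut point."""
    if point is None:
        point = random.randrange(1, len(mom))  # random cut if none given
    return mom[:point] + dad[point:], dad[:point] + mom[point:]

# the slide's example
kid1, kid2 = crossover([0, 0, 1, 0, 1, 1], [0, 0, 1, 1, 0, 0], point=4)
```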
Hard Problems
Require heuristic solvers.
With more nuance, most interesting formalized decision
problems are NP-hard and hence (we assume) have no
fast (polynomial) solver in the worst case.
But many such problems are routinely solved very well with
fast solvers.
And many are not. For these we are forced to use
heuristics, including meta-heuristics, such as GAs and
other forms of evolutionary computation, tabu search,
simulated annealing, particle swarm optimization, artificial
immune systems, ant colony optimization, bespoke
heuristics, . . .
Constrained Optimization Models: Sources of Hard
Problems
Integer programming, mixed integer programming,
non-linear programming, stochastic programming,
multi-objective problems, . . . all typically present hard
problems for which only heuristic solvers are available.
Explosion of interest in applying heuristics and
meta-heuristics to ConOptModels.
Including extensive use in practice.
Canonical Form for ConOptModels
\[
\max_{x} \; z = d(x) \tag{8}
\]
subject to the constraints
\[
f_i(x) \le a_i, \quad i = 1, 2, \ldots, n_f \tag{9}
\]
\[
g_j(x) = b_j, \quad j = 1, 2, \ldots, n_g \tag{10}
\]
\[
x_l \in S_l, \quad l = 1, 2, \ldots, n_l \tag{11}
\]
Figure: Canonical form for single-objective, deterministic
ConOptModels. x = (x_1, x_2, \ldots, x_{n_l}).
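One natural distance-to-feasibility measure over this canonical form is the sum of constraint violations, the measure used later for the FI-2Pop GA's infeasible population. A sketch, with a made-up two-constraint instance:

```python
def infeasibility(x, ineqs, eqs):
    """Distance to feasibility for the canonical form: sum of violations of
    f_i(x) <= a_i and |g_j(x) - b_j| over all constraints (0 iff feasible)."""
    d = sum(max(0.0, f(x) - a) for f, a in ineqs)
    d += sum(abs(g(x) - b) for g, b in eqs)
    return d

# hypothetical instance: f1(x) = x1 + x2 <= 5 and g1(x) = x1 - x2 = 1
ineqs = [(lambda x: x[0] + x[1], 5)]
eqs = [(lambda x: x[0] - x[1], 1)]
d = infeasibility([4, 3], ineqs, eqs)  # violates the inequality by 2
```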
But, there’s a problem for GAs (and others) with
constrained optimization
The genetic operators—mutation, recombination,
etc.—generally do not respect feasibility.
How should constraint violations be handled when a GA is
being used to solve a ConOptModel?
Approaches to handling infeasible solutions: the death
penalty (wasteful), penalty functions (but which?), special
encodings that guarantee feasibility (rarely available), repair
of infeasible solutions (problem-specific and often
computationally costly).
The penalty function approach is perhaps the most natural
and is widely used (e.g., in MATLAB). Still, a number of
worries have been published.
Our New Idea, the FI-2Pop GA: Separate Feasibility and
Optimality Evaluations
Maintain two populations in the GA: a population of
feasible solutions and a population of infeasible solutions.
Fitness in the feasible population is determined solely by
the objective function.
Fitness in the infeasible population is determined solely as a
function of distance to feasibility. (We used the sum of
constraint violations.)
Each population is processed in turn with a conventional
GA. Children go to the feasible or infeasible population,
depending on. . . whether they are feasible or infeasible.
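A sketch of the two-population loop under the stated fitness rules: objective value for feasibles, sum-of-violations distance for infeasibles. The solver interface and the toy constrained problem at the end are hypothetical, not the FI-2Pop implementation used in the studies:

```python
import random

def fi2pop(objective, violation, random_solution, perturb,
           pop_size=30, generations=200):
    """Two populations: feasible solutions ranked by objective value,
    infeasible ones by distance to feasibility. Children migrate to
    whichever population matches their feasibility."""
    feas, infeas = [], []
    for _ in range(2 * pop_size):
        s = random_solution()
        (feas if violation(s) == 0 else infeas).append(s)
    for _ in range(generations):
        children = []
        for pop, fit in ((feas, objective), (infeas, lambda s: -violation(s))):
            if len(pop) < 2:
                continue  # too few members to run a tournament
            for _ in range(pop_size):
                parent = max(random.sample(pop, 2), key=fit)  # binary tournament
                children.append(perturb(parent))
        for s in children:
            (feas if violation(s) == 0 else infeas).append(s)
        # truncate each population back to size by its own fitness measure
        feas = sorted(feas, key=objective, reverse=True)[:pop_size]
        infeas = sorted(infeas, key=violation)[:pop_size]
    return feas, infeas

# toy use: maximize the number of 1s in a 10-bit string, subject to at most 6 ones
random.seed(1)
obj = sum
viol = lambda s: max(0, sum(s) - 6)           # sum of constraint violations
rand = lambda: [random.randint(0, 1) for _ in range(10)]
def mut(s):
    s = s[:]                                   # flip one random bit
    i = random.randrange(len(s))
    s[i] = 1 - s[i]
    return s
feas, infeas = fi2pop(obj, viol, rand, mut)    # feas[0]: best feasible found
```

The surviving populations are precisely the FoIs and IoIs of the earlier slides: good feasible solutions, and infeasible solutions near the feasibility boundary.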
Performance Testing
Over a series of studies, we have found excellent
performance on (apparently) hard problems.
We have also investigated, both theoretically and
empirically, the behavior of the FI2Pop GA. [2]
Note: Mindful of the No Free Lunch Theorems.
Typical behavior: FI-2Pop GA is competitive with penalty GAs
on FoIs and superior on IoIs.
Key Points
Summary
Key for deliberation support:
FoIs: Feasible solutions of Interest: Feasible, non-optimal
solutions with good objective values.
IoIs: Infeasible solutions of Interest: Infeasible solutions
near feasibility with good objective values.
These can be found by intelligently sampling the solution
space. This is just what metaheuristics, such as GAs, do.
Concept
If we had all possible solutions, we could fully answer our
post-solution deliberation questions. (But we can’t have
that.)
If we could get a representative random sample of all the
solutions, we could have good estimates for our
post-solution deliberation questions. (But the search
spaces are too large.)
If we could bias our sampling to the regions of interest
(near the boundary(ies) of the feasible region(s)), we could
have good estimates for our post-solution deliberation
questions.
That’s what we are reporting. And effective means to do it.
To be done
Explore metaheuristics fully for purposes of deliberation
support.
Two categories:
Individual, local search based.
Tabu search, simulated annealing, etc.
Population based search.
Evolutionary computation, particle swarm optimization, ant
colony optimization, artificial immune systems, scatter
search, etc.
What works best for producing FoIs and IoIs? Scaling
issues? OGIT.
Vision
We envision. . .
A transformation in how complex COModels are used in
practice.
Estimates given for all post-solution questions of interest.
A great broadening of the scope of use and application of
COModels.
An empowering of users, as information for decision
support is made readily available.
[1] J. E. Beasley. OR-Library.
http://people.brunel.ac.uk/~mastjjb/jeb/info.html.
Accessed July 27, 2009.
[2] Steven Orla Kimbrough, Gary J. Koehler, Ming Lu, and
David Harlan Wood. On a feasible–infeasible two-population
(FI-2Pop) genetic algorithm for constrained optimization:
Distance tracing and no free lunch. European Journal of
Operational Research, 190(2):310–327, 2008.