This work studies the utility of substructural neighborhoods for local search in the Bayesian optimization algorithm (BOA). The probabilistic model of BOA, which automatically identifies important problem substructures, is used to define the structure of the neighborhoods used in local search. Additionally, a surrogate fitness model is considered to evaluate the improvement of the local search steps. The results show that performing substructural local search in BOA significantly reduces the number of generations necessary to converge to optimal solutions and thus provides substantial speedups.
The Bayesian Optimization Algorithm with Substructural Local Search
1. The Bayesian Optimization Algorithm with Substructural Local Search
Claudio Lima, Martin Pelikan, Kumara Sastry,
Martin Butz, David Goldberg, and Fernando Lobo
2. Overview
Motivation
Bayesian Optimization Algorithm (BOA)
Modeling fitness in BOA
Substructural Neighborhoods
BOA with Substructural Hillclimbing
Results
Conclusions
Future Work
OBUPM 2006
3. Motivation
Probabilistic models of EDAs allow better
recombination of subsolutions
Can we get more from these models? Yes!
Efficiency enhancement on EDAs
Evaluation relaxation
Local search in substructural neighborhoods
4. Bayesian Optimization Algorithm
Pelikan, Goldberg, and Cantú-Paz (1999)
Use Bayesian networks to model good solutions
Model structure => acyclic directed graph
Nodes represent variables
Edges represent conditional dependencies
Model parameters => conditional probabilities
Conditional Probability Tables based on the observed
frequencies
Local structures: Decision Trees or Graphs
5. Learning a Bayesian Network
Start with an empty network (independence
assumption)
Perform operation that improves the metric the
most
Edge addition, edge removal, edge reversal
Metric quantifies the likelihood of the model wrt data
(good solutions)
Stop when no more improvement is possible
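The greedy loop above can be sketched in Python; `score` stands in for the actual Bayesian metric (e.g. BDe or BIC), and for brevity only edge addition is shown (the full algorithm also considers edge removal and reversal):

```python
import itertools

def creates_cycle(edges, a, b):
    """Return True if adding edge a -> b would create a directed cycle,
    i.e. if a is already reachable from b."""
    stack, seen = [b], set()
    while stack:
        node = stack.pop()
        if node == a:
            return True
        if node in seen:
            continue
        seen.add(node)
        stack.extend(v for u, v in edges if u == node)
    return False

def greedy_network_learning(score, variables):
    """Start with an empty network (independence assumption) and repeatedly
    apply the edge operation that most improves the metric, stopping when
    no operation improves it. `score(edges)` is a caller-supplied stand-in
    for the real metric -- its form is an assumption of this sketch."""
    edges = set()
    while True:
        best_gain, best_edge = 0.0, None
        for a, b in itertools.permutations(variables, 2):
            if (a, b) in edges or creates_cycle(edges, a, b):
                continue
            gain = score(edges | {(a, b)}) - score(edges)
            if gain > best_gain:
                best_gain, best_edge = gain, (a, b)
        if best_edge is None:
            return edges
        edges.add(best_edge)
```

With a toy metric that rewards a single dependency, the search adds exactly that edge and stops.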
6. A 3-bit Example
Model structure: directed acyclic graph with edges X2 -> X1 and X3 -> X1
Model parameters: conditional probability table for P(X1=1 | X2 X3):
X2X3 = 00: 0.20
X2X3 = 01: 0.20
X2X3 = 10: 0.15
X2X3 = 11: 0.45
Equivalent decision tree: split on X2; for X2 = 0, P(X1=1) = 0.20 (no split on X3 needed); for X2 = 1, split on X3: P(X1=1) = 0.15 if X3 = 0, 0.45 if X3 = 1
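The table and the tree in this example encode the same distribution; a quick check in Python:

```python
# Conditional probability table from the example: P(X1=1 | X2, X3)
cpt = {(0, 0): 0.20, (0, 1): 0.20, (1, 0): 0.15, (1, 1): 0.45}

def tree_prob(x2, x3):
    """Decision-tree form of the same CPT: because P(X1=1) is 0.20 for
    both values of X3 when X2 = 0, the tree only splits on X3 under X2 = 1."""
    if x2 == 0:
        return 0.20
    return 0.15 if x3 == 0 else 0.45

# The tree reproduces every CPT entry while storing 3 parameters instead of 4
assert all(tree_prob(x2, x3) == p for (x2, x3), p in cpt.items())
```

This is the point of local structures: identical rows of the table collapse into a single leaf.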
7. Modeling Fitness in BOA
Bayesian networks extended to store a
surrogate fitness model (Pelikan & Sastry, 2004)
The surrogate fitness is learned from a
proportion of the population...
...and is used to estimate the fitness of the
remaining individuals (therefore reducing evals)
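The evaluation step can be sketched as follows. This is a rough sketch only: the actual model of Pelikan & Sastry (2004) stores average fitness contributions alongside the conditional probabilities of the network, and the `contrib`/`parents` data layout here is an illustrative assumption:

```python
def estimate_fitness(x, f_bar, contrib, parents):
    """Sketch of surrogate evaluation: for each variable i and each
    configuration of its parents, contrib[i][(x_i, parent_values)] holds
    the average fitness contribution of that setting relative to the
    population mean f_bar. The estimate sums the stored contributions.
    This layout is an assumption of the sketch, not the paper's exact one."""
    estimate = f_bar
    for i, ps in parents.items():
        key = (x[i], tuple(x[p] for p in ps))
        estimate += contrib[i].get(key, 0.0)
    return estimate
```

For independent variables with symmetric contributions the estimate reduces to mean fitness plus a per-bit bonus.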
9. Why Substructural Neighborhoods?
An efficient mutation operator should search in
the correct neighborhood
Oftentimes this is done by incorporating
domain- or problem-specific knowledge
However, efficiency typically does not generalize
beyond a small number of applications
Bitwise local search has more general
applicability but yields inferior results
10. Substructural Neighborhoods
Neighborhoods defined by the probabilistic
model of EDAs
Exploits the underlying problem structure while
not losing generality of application
Exploration of neighborhoods respects
dependencies between variables
If [X1X2X3] form a linkage group, the neighborhood
considered will be 000, 001, 010, ..., 111
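For a linkage group of k variables, the 2^k joint settings can be enumerated directly; a minimal sketch:

```python
import itertools

def substructural_neighborhood(x, linkage_group):
    """Enumerate every solution obtained by jointly reassigning the
    variables of one linkage group while holding all other bits fixed.
    For a k-variable group this yields 2^k neighbors (000, 001, ..., 111)."""
    for bits in itertools.product((0, 1), repeat=len(linkage_group)):
        y = list(x)
        for pos, b in zip(linkage_group, bits):
            y[pos] = b
        yield tuple(y)
```

Contrast with bitwise search, which would only reach the k single-bit flips of the group from any given point.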
11. Substructural Local Search
For uniformly-scaled decomposable problems,
substructural local search scales as O(2^k m^1.5)
(Sastry & Goldberg, 2004)
Bitwise hillclimber: O(m^k log m)
Extended Compact GA with substructural local search is more robust than either single-operator-based approach (Lima et al., 2005)
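Constants aside, the gap between the two bounds is easy to see numerically; a sketch for m = 10 blocks of k = 5 bits (growth rates only, constant factors dropped):

```python
import math

def substructural_cost(m, k):
    # O(2^k * m^1.5): try all 2^k settings of each of m substructures
    return 2 ** k * m ** 1.5

def bitwise_cost(m, k):
    # O(m^k * log m): bitwise hillclimbing on m concatenated k-bit traps
    return m ** k * math.log(m)

# Even at this small size the bitwise bound is orders of magnitude larger
assert substructural_cost(10, 5) < bitwise_cost(10, 5)
```

Because the bitwise bound is exponential in k, the gap widens rapidly as the traps get harder.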
12. Substructural Neighborhoods in BOA
Model is more complex than in eCGA
What is a linkage group? Which dependencies to
consider? Is order relevant?
Example: topology of 3 different substructural
neighborhoods for variable X2 (figure omitted)
13. BOA + Substructural Hillclimbing
After model sampling, each offspring undergoes
local search with a certain probability pls
Current model is used to define the
neighborhoods
Choice of best subsolutions => surrogate fitness
model
Cost of performing local search is then minimal
15. Substructural Hillclimbing in BOA
Use reverse ancestral ordering of variables
2 different versions of the substructural
hillclimber (step 3)
Evaluated fitness
Estimated fitness
Result of local search is evaluated
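Putting the steps above together, a minimal sketch of the hillclimber. The `neighborhoods` dict (mapping each variable to the index set of its substructure) is an assumed representation of what is extracted from the model, and `fitness` may be either the surrogate or real evaluations, matching the two versions of step 3:

```python
import itertools

def substructural_hillclimb(x, order, neighborhoods, fitness):
    """Visit variables in reverse ancestral order of the Bayesian network.
    At each variable, enumerate its substructural neighborhood (all joint
    settings of its linkage group) and keep whichever setting `fitness`
    rates best."""
    x = list(x)
    for i in reversed(order):
        group = neighborhoods[i]
        best, best_fit = list(x), fitness(x)
        for bits in itertools.product((0, 1), repeat=len(group)):
            y = list(x)
            for pos, b in zip(group, bits):
                y[pos] = b
            f = fitness(y)
            if f > best_fit:
                best, best_fit = y, f
        x = best
    return x
```

On onemax with singleton neighborhoods this degenerates to bitwise hillclimbing and reaches the optimum in one pass.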
18. Onemax Results (l=50)
Correctness of substructural neighborhoods is
not relevant...
...but the choice of subsolutions relies on the
accuracy of the surrogate fitness model
More importantly, the acceptance of the best
subsolutions also depends on the surrogate, if
using estimated fitness
20. 10x5-bit trap Results (l=50)
Correct identification of problem substructure is
crucial
Different versions of the hillclimber perform
similarly (for small pls)
Cost of using evaluated fitness increases
significantly with pls (and with problem size)
Phase transition in the population size required
22. Scalability Results (5-bit traps)
Substantial speedups are obtained (η = 6 for
l = 140)
Speedup scales as O(l^0.45) for l < 80
For bigger problem sizes the speedup is more
moderate
pls = 5×10^-4 is adequate for the range of
problems tested, but the optimal proportion
should decrease for larger problem sizes
24. Scalability Issues
Optimal proportion of local search slowly
decreases with problem size
Exploration of substructural neighborhoods is
sensitive to the accuracy of model structure
Spurious linkage size grows with problem size
BOA’s sampling ability is not affected because
conditional probabilities nearly express
independence between spurious and linked
variables
25. Future Work
Model optimal proportion of local search pls
Get more accurate model structures
Only accept pairwise dependencies that improve
the metric beyond some threshold (significance test)
Study the improvement function of the metric
Consider other neighborhood topologies
Consider overlapping substructures
26. Conclusions
Incorporation of substructural local search in
BOA leads to significant speedups
Use of surrogate fitness in local search provides
effective learning of substructures with minimal
cost on evals.
The importance of designing and hybridizing
competent operators has been empirically
demonstrated