Self-Adaptive Simulated Binary Crossover
     for Real-Parameter Optimization



   Kalyanmoy Deb*, S. Karthik*, Tatsuya Okabe**
                      GECCO 2007

                    *Indian Institute of Technology, India
               **Honda Research Institute, Japan



                       Reviewed by

                   Paskorn Champrasert
                   paskorn@cs.umb.edu
                   http://dssg.cs.umb.edu
Outline
• Simulated Binary Crossover (SBX) (revisit)
• Problem in SBX
• Self-Adaptive SBX
• Simulation Results
   – Self-Adaptive SBX for Single-Objective Optimization Problems (SOOP)
   – Self-Adaptive SBX for Multi-Objective Optimization Problems (MOOP)
• Conclusion


 November 7, 08        DSSG Group Meeting       2/31
SBX
            Simulated Binary Crossover
• Simulated Binary Crossover (SBX) was proposed in
  1995 by Deb and Agrawal.

• SBX was designed with respect to the one-point
  crossover properties in binary-coded GA.
    – Average Property:
      The average of the decoded parameter values is the same
      before and after the crossover operation.
    – Spread Factor Property:
      A spread factor β ≈ 1 is more likely to occur than any
      other β value.
Important Property of One-Point Crossover
Spread factor:
  The spread factor β is defined as the ratio of the spread of the
  offspring points to that of the parent points:

                  β = |c1 − c2| / |p1 − p2|

- Contracting crossover (β < 1):
    The offspring points are enclosed by the parent points.

- Expanding crossover (β > 1):
    The offspring points enclose the parent points.

- Stationary crossover (β = 1):
    The offspring points are the same as the parent points.
Spread Factor (β) in SBX
• The probability distribution of β in SBX should have the same
  shape as the probability distribution of β in binary-coded
  one-point crossover.

• The probability density function is defined as:

      c(β) = 0.5(nc + 1) β^nc,           β ≤ 1
      c(β) = 0.5(nc + 1) / β^(nc+2),     β > 1

  [Figure: plot of c(β) for nc = 2; the density peaks at β = 1]
A large value of nc gives a higher probability of creating
'near-parent' solutions.

    Contracting (β ≤ 1):   c(β) = 0.5(nc + 1) β^nc
    Expanding (β > 1):     c(β) = 0.5(nc + 1) / β^(nc+2)

Typically, nc = 2 to 5.
Spread Factor (β) as a random number
• β is a random number that follows the proposed probability
  density function.

  To get a β value:
  1) A uniform random number u is generated in [0, 1].
  2) The β value is found for which the area under the density
     curve to the left of β equals u.

  [Figure: density c(β) for nc = 2, with u shown as the shaded
   area to the left of the sampled β]

  Offspring:
      c1 = x̄ − (β/2)(p2 − p1)
      c2 = x̄ + (β/2)(p2 − p1)
  where x̄ = (p1 + p2)/2 is the parent mean.
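The two-step sampling procedure above has a closed-form inverse CDF, so no numerical search is needed. A minimal sketch in Python (function and variable names are illustrative, not from the paper):

```python
import math
import random

def sbx_offspring(p1, p2, nc):
    """Simulated Binary Crossover for one real variable.

    Samples beta by inverting the CDF of c(beta):
      u <= 0.5 (contracting):  0.5 * beta^(nc+1) = u
      u >  0.5 (expanding):    1 - 0.5 * beta^-(nc+1) = u
    """
    u = random.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (nc + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (nc + 1.0))
    mean = 0.5 * (p1 + p2)
    # Average property: (c1 + c2) / 2 equals the parent mean.
    c1 = mean - 0.5 * beta * (p2 - p1)
    c2 = mean + 0.5 * beta * (p2 - p1)
    return c1, c2, u, beta
```

Returning u and β as well is convenient because the self-adaptive update described on the later slides reuses exactly those two quantities.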
SBX Summary
• Simulated Binary Crossover (SBX) is a real-parameter
  recombination operator commonly used in evolutionary
  algorithms (EAs) for optimization problems.

    – The SBX operator uses two parents and creates two offspring.

    – The SBX operator uses a parameter called the distribution index
      (nc), which is kept fixed (e.g., 2 or 5) throughout a simulation
      run.
          • If nc is large, the offspring solutions are close to the parent solutions.
          • If nc is small, the offspring solutions are far from the parent solutions.

    – The distribution index (nc) has a direct effect in controlling the
      spread of offspring solutions.
Research Issues
•     Setting the distribution index (nc) to an appropriate value
      is an important task.
    –      The distribution index (nc) affects:
          1) Convergence speed
                If nc is very high, the offspring solutions are very close to the
                parent solutions and the convergence speed is very low.
          2) Local/global optimum solutions
             Past studies found that SBX-based GAs cannot work well on
             complicated problems (e.g., Rastrigin's function).
Self-Adaptive SBX
• This paper proposes a procedure for updating the
  distribution index (nc) adaptively to solve optimization
  problems.

    – The distribution index (nc) is updated every generation.

    – Whether the distribution index of the next generation (n̂c) is
      increased or decreased depends on whether the offspring
      outperforms (has a better fitness value than) its two parents.

• The next slides derive the equation used to update the
  distribution index.
Probability density per offspring
         Spread factor distribution

    (1)  c(β) = 0.5(nc + 1) β^nc,          β ≤ 1
    (2)  c(β) = 0.5(nc + 1) / β^(nc+2),    β > 1

[Figure: density c(β) for nc = 2, with the contracting region (1)
 and the expanding region (2) marked; an example with p1 = 2,
 p2 = 5 and β > 1 is shown]

Offspring:
    c1 = x̄ − (β/2)(p2 − p1)
    c2 = x̄ + (β/2)(p2 − p1)
Self-Adaptive SBX (1)
Consider the case that β > 1 (the offspring c1, c2 enclose the
parents p1, p2):

     β = (c2 − c1) / |p2 − p1|

     β = ((c2 − p2) + (p2 − p1) + (p1 − c1)) / (p2 − p1)

By the average property, (c2 − p2) = (p1 − c1), so

     β = 1 + 2(c2 − p2) / (p2 − p1)    --- (1)
Self-Adaptive SBX (2)
    (1)  c(β) = 0.5(nc + 1) β^nc,          β ≤ 1
    (2)  c(β) = 0.5(nc + 1) / β^(nc+2),    β > 1

[Figure: density c(β); for an expanding crossover, u − 0.5 equals
 the area under region (2) between 1 and β]

For the expanding region (2):

    ∫_1^β 0.5(nc + 1) / t^(nc+2) dt = u − 0.5    --- (2)
Self-Adaptive SBX (3)

    ∫_1^β 0.5(nc + 1) t^−(nc+2) dt = u − 0.5    --- (2)

    −0.5 β^−(nc+1) − (−0.5) = u − 0.5
    −0.5 β^−(nc+1) = u − 1
    β^−(nc+1) = 2(1 − u)
    −(nc + 1) log β = log 2(1 − u)

    nc = −(1 + log 2(1 − u) / log β)    --- (3)
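Equations (2)–(3) are easy to check numerically: sampling β from u via the inverted CDF and then plugging (u, β) into equation (3) should recover the original nc. A small sketch (values chosen arbitrarily for illustration):

```python
import math

# Pick a distribution index and a uniform random number in the
# expanding range (u > 0.5).
nc = 2.0
u = 0.8

# Invert equation (2): beta^-(nc+1) = 2(1 - u).
beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (nc + 1.0))

# Equation (3) should recover the original nc from (u, beta).
nc_recovered = -(1.0 + math.log(2.0 * (1.0 - u)) / math.log(beta))
print(nc_recovered)  # ≈ 2.0
```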
Self-Adaptive SBX (4)
    c1 --- p1 --- p2 --- c2 --- ĉ2    (β > 1)

• If c2 is better than both parent solutions,
    – the authors assume that the region in which the child
      solution is created is better than the region in which
      the parent solutions are.
    – the authors intend to extend the child solution further
      away from the closest parent solution (p2) by a factor α
      (α > 1):

                 (ĉ2 − p2) = α(c2 − p2)    --- (4)
β Update
    c1 --- p1 --- p2 --- c2 --- ĉ2    (β > 1)

    β = 1 + 2(c2 − p2) / (p2 − p1)   --- (1)        (ĉ2 − p2) = α(c2 − p2)   --- (4)

from (1):
    β̂ = 1 + 2(ĉ2 − p2) / (p2 − p1)
      = 1 + 2((α(c2 − p2) + p2) − p2) / (p2 − p1)
      = 1 + 2α(c2 − p2) / (p2 − p1)

    β̂ = 1 + α(β − 1)    --- (5)
Distribution Index (nc) Update
    nc = −(1 + log 2(1 − u) / log β)   --- (3)        β̂ = 1 + α(β − 1)   --- (5)

    n̂c = −(1 + log 2(1 − u) / log β̂)

    n̂c = −(1 + log 2(1 − u) / log(1 + α(β − 1)))

From (3), log 2(1 − u) = −(nc + 1) log β, so

    n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))    --- (6)
Distribution Index (nc) Update




When c is better than both p1 and p2, the distribution index for the
next generation is calculated by (6). The distribution index is
decreased in order to create an offspring (c') farther away from its
closest parent (p2) than the offspring (c) of the previous generation.

    n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))    --- (6)

n̂c is kept within [0, 50]; α is a constant (α > 1).
Distribution Index (nc) Update
When c is worse than both p1 and p2, the distribution index for the
next generation is calculated by (7). The distribution index is
increased in order to create an offspring (c') closer to its
parent (p2) than the offspring (c) of the previous generation.

1/α is used in place of α:

    n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))      --- (6)

    n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α)     --- (7)

α is a constant (α > 1).
Distribution Index (nc) Update
So far we have handled the expanding case (2). For the contracting
case (1), the same process is applied.

    (1)  c(β) = 0.5(nc + 1) β^nc,          β ≤ 1
    (2)  c(β) = 0.5(nc + 1) / β^(nc+2),    β > 1

[Figure: density c(β); for a contracting crossover, u equals the
 area under region (1) between 0 and β]

    ∫_0^β 0.5(nc + 1) t^nc dt = u    --- (x)
Distribution Index (nc) Update
                         for the contracting case
When c is better than both p1 and p2, the distribution index for the
next generation is calculated by (8). The distribution index is
decreased in order to create an offspring (c') farther away from its
closest parent (p2) than the offspring (c) of the previous generation.

       n̂c = (1 + nc)/α − 1    --- (8)

When c is worse than both p1 and p2, 1/α is used in place of α:

       n̂c = α(1 + nc) − 1    --- (9)

α is a constant (α > 1).
Distribution Index Update Summary

                  Expanding Case                        Contracting Case
                     (u > 0.5)                              (u < 0.5)

c is better than parents:
    n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1))        n̂c = (1 + nc)/α − 1

c is worse than parents:
    n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α)       n̂c = α(1 + nc) − 1

If u = 0.5, n̂c = nc.
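The whole update table can be written as one function. A sketch in Python (the [0, 50] clamp follows the earlier slide; function and variable names are illustrative):

```python
import math

def update_nc(nc, u, beta, alpha, child_better):
    """Self-adaptive update of the distribution index nc.

    u, beta      : the uniform number and spread factor used in the crossover
    alpha        : constant > 1
    child_better : True if the offspring is better than both parents
    """
    if abs(beta - 1.0) < 1e-12:           # stationary case (u = 0.5)
        return nc
    a = alpha if child_better else 1.0 / alpha
    if u > 0.5:                           # expanding case, eqs. (6)/(7)
        new_nc = -1.0 + (nc + 1.0) * math.log(beta) / math.log(1.0 + a * (beta - 1.0))
    else:                                 # contracting case, eqs. (8)/(9)
        new_nc = (1.0 + nc) / a - 1.0
    return min(max(new_nc, 0.0), 50.0)    # keep nc within [0, 50]
```

Passing 1/α instead of α covers the "worse than both parents" rows of the table in both cases, which is exactly how equations (7) and (9) were obtained from (6) and (8).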
Self-SBX Main Loop

From the initial population Pt, two parents are selected. With
probability pc, Self-SBX crossover creates offspring C1 and C2 and
nc is updated; otherwise mutation is applied. The resulting
individuals form the next population Pt+1.

    Pt --> Selection --> Parent 1, Parent 2
       --> if rand < pc: Crossover --> Offspring C1, C2 (update nc)
       --> if rand > pc: Mutation
       --> Pt+1

Parameters: N = 150, initial nc = 2, α = 1.5
Simulation Results
Sphere Problem:

 SBX
 • Initial nc is 2.
 • Population size = 150
 • Pc = 0.9
 • The number of variables (n) = 30
 • Pm = 1/30
 • A run is terminated when a solution with a function value ≤ 0.001 is found.
 • 11 runs are performed.

 Self-SBX
 • Pc = 0.7
 • α = 1.5
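The loop and the sphere experiment can be combined into a minimal end-to-end sketch. This simplification works in one dimension and uses random pairing, elitist truncation, and Gaussian mutation instead of the paper's exact selection and mutation operators; all names are illustrative:

```python
import math
import random

def sbx_offspring(p1, p2, nc):
    # Sample beta by inverting the SBX spread-factor CDF.
    u = random.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (nc + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (nc + 1.0))
    mean = 0.5 * (p1 + p2)
    return mean - 0.5 * beta * (p2 - p1), mean + 0.5 * beta * (p2 - p1), u, beta

def update_nc(nc, u, beta, alpha, child_better):
    # Self-adaptive nc update, clamped to [0, 50].
    if abs(beta - 1.0) < 1e-12:
        return nc
    a = alpha if child_better else 1.0 / alpha
    if u > 0.5:
        new_nc = -1.0 + (nc + 1.0) * math.log(beta) / math.log(1.0 + a * (beta - 1.0))
    else:
        new_nc = (1.0 + nc) / a - 1.0
    return min(max(new_nc, 0.0), 50.0)

def sphere(x):
    return x * x

random.seed(0)
nc, alpha, pc, pop_size = 2.0, 1.5, 0.7, 30
pop = [random.uniform(-5.0, 5.0) for _ in range(pop_size)]
init_best = min(sphere(x) for x in pop)

for gen in range(100):
    children = []
    while len(children) < pop_size:
        p1, p2 = random.sample(pop, 2)
        if random.random() < pc:
            c1, c2, u, beta = sbx_offspring(p1, p2, nc)
            fc = min(sphere(c1), sphere(c2))
            if fc < min(sphere(p1), sphere(p2)):      # better than both parents
                nc = update_nc(nc, u, beta, alpha, True)
            elif fc > max(sphere(p1), sphere(p2)):    # worse than both parents
                nc = update_nc(nc, u, beta, alpha, False)
        else:
            # Gaussian mutation as a stand-in for the paper's mutation operator.
            c1 = p1 + random.gauss(0.0, 0.1)
            c2 = p2 + random.gauss(0.0, 0.1)
        children.extend([c1, c2])
    # Elitist truncation: keep the best pop_size of parents + children.
    pop = sorted(pop + children, key=sphere)[:pop_size]

best = min(pop, key=sphere)
print(sphere(best))  # best fitness; non-increasing over generations due to elitism
```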
Fitness Value

                                   • Self-SBX can
                                     find better
                                     solutions




α Study
                         • α is varied in [1.05,2]
                         • The effect of α is not
                           significant since the
                           average number of
                           function evaluations is
                           similar.




Simulation Results
Rastrigin’s Function:

 Many local optima, one global minimum (0)
 The number of variables = 20
 SBX
 • Initial nc is 2.
 • Population size =100
 • Pc = 0.9
 • The number of variables (n) =30
 • Pm =1/30
 • A run is terminated when a solution with a function value ≤ 0.001 is found
   or the maximum of 40,000 generations is reached.
 • 11 runs are performed.

 Self-SBX
 • Pc = 0.7
 • α = 1.5
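For reference, the Rastrigin function used in this experiment has the standard form, with many local optima and a single global minimum at the origin. A minimal definition:

```python
import math

def rastrigin(x):
    """Rastrigin's function: many local optima, one global
    minimum f(0, ..., 0) = 0."""
    n = len(x)
    return 10.0 * n + sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) for v in x)

print(rastrigin([0.0] * 20))  # 0.0
```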
Fitness Value



                 [Table: best / median / worst fitness values]

• Self-SBX can find solutions very close to the global optimum.
α Study
                              • α is varied in [1.05, 10].
                              • The effect of α is not significant,
                                since the average number of
                                function evaluations is similar.
                              • α = 3 performs best.
Self-SBX in MOOP
                       • Self-SBX is applied in
                         NSGA-II.
                       • A larger hypervolume is
                         better.
Conclusion
• Self-SBX is proposed in this paper.
    – The distribution index is adjusted adaptively at run
      time.
• The simulation results show that Self-SBX can
  find better solutions in both single-objective and
  multi-objective optimization problems.

• However, a constant α must still be chosen to tune the
  distribution index.

Contenu connexe

Tendances

Prml 最尤推定からベイズ曲線フィッティング
Prml 最尤推定からベイズ曲線フィッティングPrml 最尤推定からベイズ曲線フィッティング
Prml 最尤推定からベイズ曲線フィッティングtakutori
 
臨床疫学研究における傾向スコア分析の使い⽅ 〜観察研究における治療効果研究〜
臨床疫学研究における傾向スコア分析の使い⽅ 〜観察研究における治療効果研究〜臨床疫学研究における傾向スコア分析の使い⽅ 〜観察研究における治療効果研究〜
臨床疫学研究における傾向スコア分析の使い⽅ 〜観察研究における治療効果研究〜Yasuyuki Okumura
 
ウィナーフィルタと適応フィルタ
ウィナーフィルタと適応フィルタウィナーフィルタと適応フィルタ
ウィナーフィルタと適応フィルタToshihisa Tanaka
 
Jij 数理最適化入門セミナー第1回「数理モデルを記述するためのJijModelingの紹介」
Jij 数理最適化入門セミナー第1回「数理モデルを記述するためのJijModelingの紹介」Jij 数理最適化入門セミナー第1回「数理モデルを記述するためのJijModelingの紹介」
Jij 数理最適化入門セミナー第1回「数理モデルを記述するためのJijModelingの紹介」ssuserfd0ee61
 
凸最適化 〜 双対定理とソルバーCVXPYの紹介 〜
凸最適化 〜 双対定理とソルバーCVXPYの紹介 〜凸最適化 〜 双対定理とソルバーCVXPYの紹介 〜
凸最適化 〜 双対定理とソルバーCVXPYの紹介 〜Tomoki Yoshida
 
PRML輪読#6
PRML輪読#6PRML輪読#6
PRML輪読#6matsuolab
 
Graph Neural Networks
Graph Neural NetworksGraph Neural Networks
Graph Neural Networkstm1966
 
[DL輪読会]Deep Learning 第17章 モンテカルロ法
[DL輪読会]Deep Learning 第17章 モンテカルロ法[DL輪読会]Deep Learning 第17章 モンテカルロ法
[DL輪読会]Deep Learning 第17章 モンテカルロ法Deep Learning JP
 
オープンデータのためのスクレイピング
オープンデータのためのスクレイピングオープンデータのためのスクレイピング
オープンデータのためのスクレイピング直之 伊藤
 
観察データを用いた因果推論に共変量選択
観察データを用いた因果推論に共変量選択観察データを用いた因果推論に共変量選択
観察データを用いた因果推論に共変量選択Jaehyun Song
 
PRML 8.2 条件付き独立性
PRML 8.2 条件付き独立性PRML 8.2 条件付き独立性
PRML 8.2 条件付き独立性sleepy_yoshi
 
セグメントツリーにセグメントツリーを乗せる手法(画像付録)
セグメントツリーにセグメントツリーを乗せる手法(画像付録)セグメントツリーにセグメントツリーを乗せる手法(画像付録)
セグメントツリーにセグメントツリーを乗せる手法(画像付録)光喜 濱屋
 
PRML復々習レーン#2 2.3.6 - 2.3.7
PRML復々習レーン#2 2.3.6 - 2.3.7PRML復々習レーン#2 2.3.6 - 2.3.7
PRML復々習レーン#2 2.3.6 - 2.3.7sleepy_yoshi
 
ベイズ推論とシミュレーション法の基礎
ベイズ推論とシミュレーション法の基礎ベイズ推論とシミュレーション法の基礎
ベイズ推論とシミュレーション法の基礎Tomoshige Nakamura
 
Data Driven Process Optimization Using Real-Coded Genetic Algorithms ~陳奇中教授演講投影片
Data Driven Process Optimization Using Real-Coded Genetic Algorithms ~陳奇中教授演講投影片Data Driven Process Optimization Using Real-Coded Genetic Algorithms ~陳奇中教授演講投影片
Data Driven Process Optimization Using Real-Coded Genetic Algorithms ~陳奇中教授演講投影片Chyi-Tsong Chen
 
Rでソーシャルネットワーク分析
Rでソーシャルネットワーク分析Rでソーシャルネットワーク分析
Rでソーシャルネットワーク分析Hiroko Onari
 

Tendances (20)

Prml 最尤推定からベイズ曲線フィッティング
Prml 最尤推定からベイズ曲線フィッティングPrml 最尤推定からベイズ曲線フィッティング
Prml 最尤推定からベイズ曲線フィッティング
 
臨床疫学研究における傾向スコア分析の使い⽅ 〜観察研究における治療効果研究〜
臨床疫学研究における傾向スコア分析の使い⽅ 〜観察研究における治療効果研究〜臨床疫学研究における傾向スコア分析の使い⽅ 〜観察研究における治療効果研究〜
臨床疫学研究における傾向スコア分析の使い⽅ 〜観察研究における治療効果研究〜
 
ウィナーフィルタと適応フィルタ
ウィナーフィルタと適応フィルタウィナーフィルタと適応フィルタ
ウィナーフィルタと適応フィルタ
 
Jij 数理最適化入門セミナー第1回「数理モデルを記述するためのJijModelingの紹介」
Jij 数理最適化入門セミナー第1回「数理モデルを記述するためのJijModelingの紹介」Jij 数理最適化入門セミナー第1回「数理モデルを記述するためのJijModelingの紹介」
Jij 数理最適化入門セミナー第1回「数理モデルを記述するためのJijModelingの紹介」
 
Chapter6.4
Chapter6.4Chapter6.4
Chapter6.4
 
PRML11章
PRML11章PRML11章
PRML11章
 
凸最適化 〜 双対定理とソルバーCVXPYの紹介 〜
凸最適化 〜 双対定理とソルバーCVXPYの紹介 〜凸最適化 〜 双対定理とソルバーCVXPYの紹介 〜
凸最適化 〜 双対定理とソルバーCVXPYの紹介 〜
 
PRML輪読#6
PRML輪読#6PRML輪読#6
PRML輪読#6
 
Graph Neural Networks
Graph Neural NetworksGraph Neural Networks
Graph Neural Networks
 
[DL輪読会]Deep Learning 第17章 モンテカルロ法
[DL輪読会]Deep Learning 第17章 モンテカルロ法[DL輪読会]Deep Learning 第17章 モンテカルロ法
[DL輪読会]Deep Learning 第17章 モンテカルロ法
 
オープンデータのためのスクレイピング
オープンデータのためのスクレイピングオープンデータのためのスクレイピング
オープンデータのためのスクレイピング
 
観察データを用いた因果推論に共変量選択
観察データを用いた因果推論に共変量選択観察データを用いた因果推論に共変量選択
観察データを用いた因果推論に共変量選択
 
PRML 8.2 条件付き独立性
PRML 8.2 条件付き独立性PRML 8.2 条件付き独立性
PRML 8.2 条件付き独立性
 
セグメントツリーにセグメントツリーを乗せる手法(画像付録)
セグメントツリーにセグメントツリーを乗せる手法(画像付録)セグメントツリーにセグメントツリーを乗せる手法(画像付録)
セグメントツリーにセグメントツリーを乗せる手法(画像付録)
 
PRML復々習レーン#2 2.3.6 - 2.3.7
PRML復々習レーン#2 2.3.6 - 2.3.7PRML復々習レーン#2 2.3.6 - 2.3.7
PRML復々習レーン#2 2.3.6 - 2.3.7
 
ベイズ推論とシミュレーション法の基礎
ベイズ推論とシミュレーション法の基礎ベイズ推論とシミュレーション法の基礎
ベイズ推論とシミュレーション法の基礎
 
Data Driven Process Optimization Using Real-Coded Genetic Algorithms ~陳奇中教授演講投影片
Data Driven Process Optimization Using Real-Coded Genetic Algorithms ~陳奇中教授演講投影片Data Driven Process Optimization Using Real-Coded Genetic Algorithms ~陳奇中教授演講投影片
Data Driven Process Optimization Using Real-Coded Genetic Algorithms ~陳奇中教授演講投影片
 
Rでソーシャルネットワーク分析
Rでソーシャルネットワーク分析Rでソーシャルネットワーク分析
Rでソーシャルネットワーク分析
 
Chapter2.3.6
Chapter2.3.6Chapter2.3.6
Chapter2.3.6
 
動的計画法
動的計画法動的計画法
動的計画法
 

Similaire à Self-Adaptive Simulated Binary Crossover for Real-Parameter Optimization

Day 1 intro to functions
Day 1 intro to functionsDay 1 intro to functions
Day 1 intro to functionsErik Tjersland
 
Lecture 2: Data-Intensive Computing for Text Analysis (Fall 2011)
Lecture 2: Data-Intensive Computing for Text Analysis (Fall 2011)Lecture 2: Data-Intensive Computing for Text Analysis (Fall 2011)
Lecture 2: Data-Intensive Computing for Text Analysis (Fall 2011)Matthew Lease
 
Computational complexity and simulation of rare events of Ising spin glasses
Computational complexity and simulation of rare events of Ising spin glasses Computational complexity and simulation of rare events of Ising spin glasses
Computational complexity and simulation of rare events of Ising spin glasses Martin Pelikan
 
Lecture6
Lecture6Lecture6
Lecture6voracle
 
Alg2.7 A Notes
Alg2.7 A NotesAlg2.7 A Notes
Alg2.7 A Notesmbetzel
 
Kai – An Open Source Implementation of Amazon’s Dynamo
Kai – An Open Source Implementation of Amazon’s DynamoKai – An Open Source Implementation of Amazon’s Dynamo
Kai – An Open Source Implementation of Amazon’s DynamoTakeru INOUE
 
Day 16 representing with equations
Day 16 representing with equationsDay 16 representing with equations
Day 16 representing with equationsErik Tjersland
 
Fundamentals of Engineering Probability Visualization Techniques & MatLab Cas...
Fundamentals of Engineering Probability Visualization Techniques & MatLab Cas...Fundamentals of Engineering Probability Visualization Techniques & MatLab Cas...
Fundamentals of Engineering Probability Visualization Techniques & MatLab Cas...Jim Jenkins
 
28. distance formulatemplatetouchpad
28. distance formulatemplatetouchpad28. distance formulatemplatetouchpad
28. distance formulatemplatetouchpadMedia4math
 
The business case for automated software engineering
The business case for automated software engineering The business case for automated software engineering
The business case for automated software engineering CS, NcState
 
Distilling Free-Form Natural Laws from Experimental Data
Distilling Free-Form Natural Laws from Experimental DataDistilling Free-Form Natural Laws from Experimental Data
Distilling Free-Form Natural Laws from Experimental Dataswissnex San Francisco
 
On recent improvements in the conic optimizer in MOSEK
On recent improvements in the conic optimizer in MOSEKOn recent improvements in the conic optimizer in MOSEK
On recent improvements in the conic optimizer in MOSEKedadk
 
Locating objects without_bounding_boxes - paper review
Locating objects without_bounding_boxes - paper reviewLocating objects without_bounding_boxes - paper review
Locating objects without_bounding_boxes - paper reviewtaeseon ryu
 
Lesson 21: Curve Sketching (Section 10 version)
Lesson 21: Curve Sketching (Section 10 version)Lesson 21: Curve Sketching (Section 10 version)
Lesson 21: Curve Sketching (Section 10 version)Matthew Leingang
 
Anil Thomas - Object recognition
Anil Thomas - Object recognitionAnil Thomas - Object recognition
Anil Thomas - Object recognitionIntel Nervana
 
The H.264 Integer Transform
The H.264 Integer TransformThe H.264 Integer Transform
The H.264 Integer TransformIain Richardson
 
Kai – An Open Source Implementation of Amazon’s Dynamo (in Japanese)
Kai – An Open Source Implementation of Amazon’s Dynamo (in Japanese)Kai – An Open Source Implementation of Amazon’s Dynamo (in Japanese)
Kai – An Open Source Implementation of Amazon’s Dynamo (in Japanese)Takeru INOUE
 


Self-Adaptive Simulated Binary Crossover for Real-Parameter Optimization

  • 1. Self-Adaptive Simulated Binary Crossover for Real-Parameter Optimization. Kalyanmoy Deb*, S Karthik*, Tatsuya Okabe**. GECCO 2007. *Indian Inst., India; **Honda Research Inst., Japan. Reviewed by Paskorn Champrasert, paskorn@cs.umb.edu, http://dssg.cs.umb.edu
  • 2. Outline • Simulated Binary Crossover (SBX Crossover ) (revisit) • Problem in SBX • Self-Adaptive SBX • Simulation Results – Self-Adaptive SBX for Single-OOP – Self-Adaptive SBX for Multi-OOP • Conclusion November 7, 08 DSSG Group Meeting 2/31
  • 3. SBX: Simulated Binary Crossover • Simulated Binary Crossover (SBX) was proposed in 1995 by Deb and Agrawal. • SBX was designed to reproduce the properties of one-point crossover in binary-coded GAs. – Average Property: the average of the decoded parameter values is the same before and after the crossover operation. – Spread Factor Property: a spread factor β ≈ 1 is more likely to occur than any other β value.
  • 4. Important Property of One-Point Crossover. Spread factor: the spread factor β is defined as the ratio of the spread of the offspring points to that of the parent points: β = |c1 − c2| / |p1 − p2|. – Contracting crossover (β < 1): the offspring points are enclosed by the parent points. – Expanding crossover (β > 1): the offspring points enclose the parent points. – Stationary crossover (β = 1): the offspring points coincide with the parent points.
  • 5. Spread Factor (β) in SBX • The probability distribution of β in SBX should have the same shape as the probability distribution of β in binary-coded crossover. • The probability distribution function is defined as: c(β) = 0.5(nc + 1)β^nc for β ≤ 1, and c(β) = 0.5(nc + 1)/β^(nc+2) for β > 1 (the plot shows the distribution for nc = 2).
  • 6. A large value of nc gives a higher probability of creating 'near-parent' solutions: c(β) = 0.5(nc + 1)β^nc for β ≤ 1 (contracting); c(β) = 0.5(nc + 1)/β^(nc+2) for β > 1 (expanding). Typically, nc = 2 to 5.
  • 7. Spread Factor (β) as a random number • β is a random number that follows the proposed probability distribution function. To get a β value: 1) a uniform random number u is generated in [0,1]; 2) the β value that makes the area under the curve equal to u is taken. Offspring: c1 = x̄ − ½β(p2 − p1), c2 = x̄ + ½β(p2 − p1), where x̄ is the parent mean.
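The β-sampling and offspring rules above can be sketched as follows (a minimal illustration, not the authors' code; the closed-form inverse of the two-branch β distribution is the standard SBX sampling rule, and the function names are mine):

```python
import random

def sample_beta(nc, u=None):
    # Inverse-CDF sampling of the spread factor beta for distribution index nc.
    # CDF is 0.5*beta^(nc+1) on the contracting branch (beta <= 1) and
    # 1 - 0.5/beta^(nc+1) on the expanding branch (beta > 1).
    if u is None:
        u = random.random()
    if u <= 0.5:
        return (2.0 * u) ** (1.0 / (nc + 1))                   # beta <= 1
    return (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (nc + 1))       # beta > 1

def sbx_offspring(p1, p2, nc):
    # Two offspring symmetric about the parent mean, spread by beta.
    beta = sample_beta(nc)
    mean = 0.5 * (p1 + p2)
    c1 = mean - 0.5 * beta * (p2 - p1)
    c2 = mean + 0.5 * beta * (p2 - p1)
    return c1, c2
```

Note that u = 0.5 gives β = 1 on either branch, and the offspring mean always equals the parent mean (the Average Property from slide 3).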
  • 8. SBX Summary • Simulated Binary Crossover (SBX) is a real-parameter recombination operator commonly used in evolutionary algorithms (EAs) for optimization problems. – The SBX operator takes two parents and creates two offspring. – The SBX operator uses a parameter called the distribution index (nc), which is kept fixed (e.g., 2 or 5) throughout a simulation run. • If nc is large, the offspring solutions are close to the parent solutions. • If nc is small, the offspring solutions are far from the parent solutions. – The distribution index (nc) thus directly controls the spread of the offspring solutions.
  • 9. Research Issues • Setting the distribution index (nc) to an appropriate value is an important task. – The distribution index (nc) affects: 1) convergence speed: if nc is very high, the offspring solutions are very close to the parent solutions and convergence is very slow; 2) local/global optimality: past studies found that SBX-based GAs do not work well on complicated problems (e.g., Rastrigin's function).
  • 10. Self-Adaptive SBX • This paper proposes a procedure for updating the distribution index (nc) adaptively to solve optimization problems. – The distribution index (nc) is updated every generation. – Whether the distribution index of the next generation is increased or decreased depends on whether the offspring outperforms (has a better fitness value than) its two parents. • The next slides derive the equation used to update the distribution index.
  • 11. Probability density per offspring. Spread factor distribution: (1) c(β) = 0.5(nc + 1)β^nc for β ≤ 1; (2) c(β) = 0.5(nc + 1)/β^(nc+2) for β > 1 (plot for nc = 2). An example with p1 = 2, p2 = 5 and β > 1. Offspring: c1 = x̄ − ½β(p2 − p1), c2 = x̄ + ½β(p2 − p1).
  • 12. Self-Adaptive SBX (1) Consider the case β > 1 (c1 < p1 < p2 < c2): β = (c2 − c1)/|p2 − p1| = ((c2 − p2) + (p2 − p1) + (p1 − c1))/(p2 − p1). Since (c2 − p2) = (p1 − c1), this gives β = 1 + 2(c2 − p2)/(p2 − p1) --- (1)
  • 13. Self-Adaptive SBX (2) For the expanding branch (2), the β value corresponding to quantile u satisfies: ∫₁^β 0.5(nc + 1)/t^(nc+2) dt = u − 0.5 --- (2)
  • 14. Self-Adaptive SBX (3) Solving (2): ∫₁^β 0.5(nc + 1)t^(−(nc+2)) dt = u − 0.5 ⇒ −0.5β^(−nc−1) − (−0.5) = u − 0.5 ⇒ −0.5β^(−nc−1) = u − 1 ⇒ β^(−nc−1) = 2(1 − u) ⇒ (−nc − 1) log β = log 2(1 − u) ⇒ nc = −(1 + log 2(1 − u)/log β) --- (3)
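Equation (3) can be checked numerically: it recovers the distribution index that places a given β at quantile u of the expanding branch (a sketch for the stated case u > 0.5, β > 1; the function name is mine):

```python
import math

def nc_from_beta(beta, u):
    # Equation (3): the distribution index nc that would place 'beta'
    # at quantile u of the expanding branch (u > 0.5, beta > 1).
    return -(1.0 + math.log(2.0 * (1.0 - u)) / math.log(beta))
```

For example, for nc = 2 the inverse CDF of the expanding branch gives β = (1/(2(1 − u)))^(1/(nc+1)); plugging that β back into the formula returns nc = 2.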
  • 15. Self-Adaptive SBX (4) • If c2 is better than both parent solutions, – the authors assume that the region in which the child solution was created is better than the region in which the parent solutions lie, and – they intend to push the child solution further away from the closest parent solution (p2) by a factor α (α > 1): (ĉ2 − p2) = α(c2 − p2) --- (4)
  • 16. β Update (β > 1). From (1), β = 1 + 2(c2 − p2)/(p2 − p1), and from (4), (ĉ2 − p2) = α(c2 − p2). Hence β̂ = 1 + 2(ĉ2 − p2)/(p2 − p1) = 1 + 2α(c2 − p2)/(p2 − p1) = 1 + α(β − 1) --- (5)
  • 17. Distribution Index (nc) Update. From (3), nc = −(1 + log 2(1 − u)/log β), i.e., log 2(1 − u) = −(nc + 1) log β. Substituting β̂ = 1 + α(β − 1) from (5): n̂c = −(1 + log 2(1 − u)/log β̂) = −1 + (nc + 1) log β / log(1 + α(β − 1)) --- (6)
  • 18. Distribution Index (nc) Update. When c is better than both p1 and p2, the distribution index for the next generation is calculated by (6). The distribution index is decreased in order to create an offspring (c') farther away from its parent (p2) than the offspring (c) of the previous generation: n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1)) --- (6). n̂c is kept within [0,50]; α is a constant (α > 1).
  • 19. Distribution Index (nc) Update. When c is worse than both p1 and p2, the distribution index for the next generation is calculated by (7). The distribution index is increased in order to create an offspring (c') closer to its parent (p2) than the offspring (c) of the previous generation; 1/α is used instead of α in (6): n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α) --- (7). α is a constant (α > 1).
  • 20. Distribution Index (nc) Update. This completes the expanding case (2). What about the contracting case (1)? The same process is applied to (1): ∫₀^β 0.5(nc + 1)t^nc dt = u --- (x)
  • 21. Distribution Index (nc) Update for the contracting case. When c is better than both p1 and p2, the distribution index for the next generation is calculated by (8). The distribution index is decreased in order to create an offspring (c') farther away from its parent (p2) than the offspring (c) of the previous generation: n̂c = (1 + nc)/α − 1 --- (8). When c is worse than p1 and p2, 1/α is used instead of α: n̂c = α(1 + nc) − 1 --- (9). α is a constant (α > 1).
  • 22. Distribution Index Update Summary. Expanding case (u > 0.5): if c is better than its parents, n̂c = −1 + (nc + 1) log β / log(1 + α(β − 1)); if c is worse, n̂c = −1 + (nc + 1) log β / log(1 + (β − 1)/α). Contracting case (u < 0.5): if c is better, n̂c = (1 + nc)/α − 1; if c is worse, n̂c = α(1 + nc) − 1. If u = 0.5, n̂c = nc.
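The four update rules in the summary can be collected into one function (a sketch of the table above, not the authors' code; argument names are mine, and swapping α for 1/α folds the better/worse rows into one expression per case):

```python
import math

def update_nc(nc, beta, u, child_better, alpha=1.5):
    # Self-adaptive update of the distribution index, per the summary table.
    # 'u' is the uniform number used to sample beta; alpha > 1.
    if u == 0.5:
        return nc                          # stationary: keep nc unchanged
    a = alpha if child_better else 1.0 / alpha
    if u > 0.5:                            # expanding case, beta > 1
        return -1.0 + (nc + 1.0) * math.log(beta) / math.log(1.0 + a * (beta - 1.0))
    return (1.0 + nc) / a - 1.0            # contracting case, beta < 1
```

As the slides describe, a better child drives nc down (wider spread next generation) and a worse child drives it up (narrower spread).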
  • 23. Self-SBX Main Loop. From the population Pt, two parents are selected; if rand < pc, crossover creates offspring C1 and C2 and nc is updated; otherwise (rand > pc), mutation is applied. The results form the population Pt+1. Initial settings: N = 150, nc = 2, α = 1.5.
  • 24. Simulation Results, Sphere Problem. SBX: • initial nc is 2 • population size = 150 • pc = 0.9 • number of variables (n) = 30 • pm = 1/30 • a run is terminated when a solution with a function value ≤ 0.001 is found • 11 runs are performed. Self-SBX: • pc = 0.7 • α = 1.5.
  • 25. Fitness Value • Self-SBX finds better solutions.
  • 26. α Study • α is varied in [1.05, 2]. • The effect of α is not significant, since the average number of function evaluations is similar across values.
  • 27. Simulation Results, Rastrigin's Function: many local optima, one global minimum (0); the number of variables = 20. SBX: • initial nc is 2 • population size = 100 • pc = 0.9 • the number of variables (n) = 30 • pm = 1/30 • a run is terminated when a solution with a function value ≤ 0.001 is found or the maximum of 40,000 generations is reached • 11 runs are performed. Self-SBX: • pc = 0.7 • α = 1.5.
  • 28. Fitness Value (Best / Median / Worst) • Self-SBX finds solutions very close to the global optimum.
  • 29. α Study • α is varied in [1.05, 10]. • The effect of α is not significant, since the average number of function evaluations is similar across values. • α = 3 performs best.
  • 30. Self-SBX in MOOP • Self-SBX is applied within NSGA-II. • A larger hyper-volume is better.
  • 31. Conclusion • Self-SBX is proposed in this paper. – The distribution index is adjusted adaptively at run time. • The simulation results show that Self-SBX finds better solutions in both single-objective and multi-objective optimization problems. • However, a constant α is still needed to tune the distribution index.