


Evolving Neural Networks Through Augmenting Topologies


  1. Evolving Neural Networks through Augmenting Topologies. Authors: Kenneth O. Stanley, Risto Miikkulainen. Speaker: Daniele Loiacono
  2. Outline
     - Introduction to Genetic Algorithms (in less than 5 minutes)
     - Introduction to Neural Networks (in about 5 minutes)
     - Neuroevolution
     - NEAT
  3. GA
  4. Outline
     - Introduction to Genetic Algorithms
     - Introduction to Neural Networks
     - Neuroevolution
     - NEAT
  5. The population (David Brogan, 2006)
  6. Ranking by fitness (David Brogan, 2006)
  7. Mate selection: selection probability is proportional to fitness (David Brogan, 2006)
  8. Crossover: exploit the goodness of the parents (David Brogan, 2006)
  9. Mutation: explore the unknown (David Brogan, 2006)
  10. Best solution (David Brogan, 2006)
  11. (David Brogan, 2006)
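The GA steps in the slides above (fitness ranking, fitness-proportional mate selection, crossover, mutation, best solution) can be sketched as a single loop. This is a minimal illustration, not code from the talk; the bit-string genome, population size, and mutation rate are arbitrary illustrative choices:

```python
import random

def evolve(fitness, genome_len=16, pop_size=20, generations=50, p_mut=0.02):
    """Minimal generational GA over bit-string genomes."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        # Ranking by fitness; the epsilon avoids all-zero roulette weights
        scores = [fitness(g) + 1e-9 for g in pop]
        def pick():
            # Mate selection: probability proportional to fitness (roulette wheel)
            return random.choices(pop, weights=scores, k=1)[0]
        next_pop = []
        for _ in range(pop_size):
            a, b = pick(), pick()
            cut = random.randrange(1, genome_len)        # one-point crossover:
            child = a[:cut] + b[cut:]                    # exploit goodness of parents
            child = [bit ^ (random.random() < p_mut)     # mutation: explore the unknown
                     for bit in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)  # best solution found

# Usage: maximize the number of ones in the genome ("one-max")
best = evolve(fitness=sum)
```

With fitness-proportional selection, fitter parents are drawn more often, so after a few dozen generations the population concentrates near all-ones genomes.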
  12. NN
  13. Outline
     - Introduction to Genetic Algorithms
     - Introduction to Neural Networks
     - Neuroevolution
     - NEAT
  14. Neural Networks
     - Not much more than a black box… (Input → ? → Output)
  15. Neural Networks
     - Just a glimpse inside the box:
       - neurons
       - layers
       - connections and weights
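What "neurons, layers, connections and weights" means computationally can be sketched in a few lines: each neuron weights its inputs, sums them, and applies an activation. A minimal sketch (one hidden layer, sigmoid activations; the weights below are made up for illustration):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_out):
    """One hidden layer; each row of w_hidden holds the input weights of one hidden neuron."""
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in w_hidden]
    return [sigmoid(sum(w * h for w, h in zip(row, hidden))) for row in w_out]

# Usage: 2 inputs -> 2 hidden neurons -> 1 output
out = forward([1.0, 0.0], w_hidden=[[0.5, -0.5], [1.0, 1.0]], w_out=[[1.0, -1.0]])
```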
  16. Learning weights of NNs
     - The weights of a NN can be learned from a set of samples <x, y> so as to minimize the network error
     - Many algorithms have been introduced in the literature (e.g. error backpropagation)
     - Unfortunately:
       - we do not always have a set of samples
       - learning can get stuck in local minima
  17. Design of network topology
     - What is the best network topology for solving a given problem?
     - It is a key factor for a successful application of NNs!
     - Unfortunately, finding it is a trial-and-error process!
  18. NE and TWEANN
  19. Outline
     - Introduction to Genetic Algorithms
     - Introduction to Neural Networks
     - Neuroevolution
     - NEAT
  20. Neuroevolution (NE)
     - Artificial evolution of NNs using a GA
     - Fitness is an evaluation of the NN's performance
     - Advantages:
       - no need for a set of samples
       - avoids local minima
     - What is evolved?
       - weights in a fixed network topology
       - topology along with weights (TWEANN)
  21. TWEANN issues
     - Encoding:
       - direct
       - indirect
     - Mating (crossover):
       - competing conventions
       - free topology and the Holy Grail
     - Protecting innovations
     - Initialization and topology minimization
     - Examples of TWEANN systems
  22. TWEANN encoding
     - How to encode a neural network?
     - Direct encoding:
       - the genome specifies the phenotype explicitly
       - many encodings have been proposed (from binary encoding to graph encoding)
     - Indirect encoding:
       - the genome specifies how to build the network
       - allows a more compact representation
       - can bias the search
  23. Mating in TWEANN
     - How can we mate two networks in a meaningful way?
     - Challenging even for small networks with the same topology (e.g. matching neurons n1 ↔ n'1, n2 ↔ n'2, n3 ↔ n'3)
  24. Mating in TWEANN
     - How can we mate two networks in a meaningful way?
     - Competing Conventions Problem: the same solution can be represented with the hidden neurons (e.g. A, B, C) in different orders, so crossover between the two representations can lose information
  25. Mating in TWEANN
     - For fixed and constrained topologies the competing conventions problem can be solved with a tailored encoding
     - But what about free topologies? The Holy Grail (Radcliffe, 1993)
  26. Protecting innovations
     - Innovations usually involve larger networks with more connections
     - Larger networks require more time to be optimized and cannot compete with smaller ones (at least in the short run)
     - Speciation is an effective technique to avoid competition and mating between networks that are too different
  27. Initialization and topology minimization
     - An initial population of random networks has drawbacks:
       - networks may be unconnected
       - the networks to mate are heterogeneous
     - How can network topology be minimized?
       - networks bigger than necessary are expensive to optimize
       - a fitness penalty on size is one option
  28. Structured Genetic Algorithm (sGA), Dasgupta and McGregor (1992)
     - Binary encoding: a bit string represents the connectivity matrix of the network
     - Advantages:
       - evolution can be performed with a standard GA
     - Drawbacks:
       - space is quadratic in the number of nodes
       - the number of nodes is limited
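The sGA idea of a bit string as a connectivity matrix can be sketched as follows (the row-major layout is my assumption); the quadratic space cost is visible in that n nodes need n² bits:

```python
def decode_connectivity(bits, n):
    """Interpret an n*n bit string as an adjacency matrix: bit i*n+j set means edge i -> j."""
    assert len(bits) == n * n  # space grows quadratically with the number of nodes
    return [[bits[i * n + j] for j in range(n)] for i in range(n)]

# Usage: 3 nodes, so 9 bits; m[0][1] == 1 means node 0 feeds node 1
m = decode_connectivity([0, 1, 1, 0, 0, 1, 0, 0, 0], 3)
```

Because the genome is a plain bit string, standard GA crossover and mutation apply unchanged, which is exactly the advantage the slide notes.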
  29. Breeder Genetic Programming, Zhang and Muhlenbein
     - Tree encoding
     - Only crossover adapts the topology
     - The fitness has an explicit penalty term for bigger networks
     - How do we set the penalty? Does it depend on the problem?
  30. Parallel Distributed Genetic Programming (PDGP), Pujol and Poli (1997)
     - Dual encoding:
       - a linear genome for mutating weights
       - a graph representation for a subgraph-swapping crossover
     - Does swapping subgraphs guarantee meaningful mating?
  31. GeNeralized Acquisition of Recurrent Links (GNARL), Angeline, Saunders, and Pollack (1993)
     - Graph encoding
     - Avoids mating, because it is considered disruptive
  32. Cellular Encoding, Gruau (1993, 1996)
     - Indirect (developmental) encoding
     - Biased search in topology space
     - Mating results are unpredictable
     - More compact representation
     - Effective, but not always more effective than direct encoding
  33. Enforced SubPopulations (ESP), Gomez and Miikkulainen (1997, 1999)
     - Fixed topology
     - When it fails, it restarts from scratch with a new random topology
     - Faster than Cellular Encoding!
  34. NEAT
  35. Outline
     - Introduction to Genetic Algorithms
     - Introduction to Neural Networks
     - Neuroevolution
     - NEAT
  36. NEAT
     - NeuroEvolution of Augmenting Topologies solves the TWEANN issues effectively thanks to:
       - historical markings
       - speciation
       - complexification
  37. Genetic Encoding in NEAT
  38. Topological Innovation
  39. Link weight mutation
     - A random number is added to or subtracted from the current weight/parameter
     - The number can be drawn from a uniform, Gaussian (normal), or other distribution
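The weight mutation described above, sketched with a zero-mean Gaussian perturbation; the step size `sigma` and per-weight probability `p` are illustrative parameters, not values from the paper:

```python
import random

def mutate_weights(weights, sigma=0.1, p=0.8):
    """Perturb each weight with probability p by adding a zero-mean Gaussian step."""
    return [w + random.gauss(0.0, sigma) if random.random() < p else w
            for w in weights]

# Usage: small random nudges to a genome's connection weights
new = mutate_weights([0.5, -1.2, 0.0])
```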
  40. Topology Matching Problem
     - When mating networks, they may have genes with the same functionality but in different positions
     - In nature, homologous genes are aligned and matched (synapsis)
     - How can we solve this problem?
  41. Historical markings
     - Two genes with the same history are expected to be homologous
     - NEAT marks every new gene with a global innovation number, so homology can be detected by comparing markings
  42. Artificial Synapsis in NEAT
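Artificial synapsis can be sketched by lining up connection genes on their innovation numbers: genes present in both parents are "matching", unmatched genes inside the other parent's innovation range are "disjoint", and those beyond it are "excess". A minimal sketch, with genes represented as (innovation, weight) pairs of my own invention:

```python
def align(parent_a, parent_b):
    """Split two genomes into matching, disjoint and excess innovation numbers."""
    a = {innov: w for innov, w in parent_a}
    b = {innov: w for innov, w in parent_b}
    matching = sorted(set(a) & set(b))
    cutoff = min(max(a), max(b))   # beyond the shorter genome's range genes are 'excess'
    unmatched = set(a) ^ set(b)
    disjoint = sorted(i for i in unmatched if i <= cutoff)
    excess = sorted(i for i in unmatched if i > cutoff)
    return matching, disjoint, excess

# Usage: innovation numbers line up genes regardless of their position in the genome
m, d, e = align([(1, 0.5), (2, -0.3), (4, 0.1)],
                [(1, 0.7), (3, 0.2), (4, 0.9), (5, 0.4)])
```

Crossover can then inherit matching genes from either parent and take disjoint/excess genes from the fitter parent, without ever comparing topologies structurally.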
  43. Speciation in NEAT
     - At the beginning, a topological innovation may not be an advantage
     - To let useful innovations survive, they have to be protected
     - NEAT solves this problem with speciation:
       - similar networks are clustered into species
       - competition and mating are restricted to networks of the same species
       - fitness sharing prevents a single species from taking over the whole population
  44. Network similarity in NEAT
     - Based on historical markings
     - A compatibility distance δ is computed from the number of excess genes (E), the number of disjoint genes (D), and the average weight distance of matching genes (W̄): δ = c1·E/N + c2·D/N + c3·W̄, where N is the number of genes in the larger genome and c1, c2, c3 are tunable coefficients
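The compatibility distance above is a direct formula; in code it is one line (the default coefficient values below are illustrative, since the paper tunes them per experiment):

```python
def compatibility(E, D, W_bar, N, c1=1.0, c2=1.0, c3=0.4):
    """delta = c1*E/N + c2*D/N + c3*W_bar, with N the size of the larger genome."""
    return c1 * E / N + c2 * D / N + c3 * W_bar

# Usage: networks whose delta falls below a threshold belong to the same species
delta = compatibility(E=1, D=2, W_bar=0.5, N=10)
```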
  45. Fitness sharing in NEAT
     - The fitness of each network is normalized by the number of networks in the same species
     - Each species is assigned a number of offspring in the next generation proportional to the fitness sum of its members
     - Offspring are generated for each species by mating the fittest r% of its members
     - Speciation gives a promising topology the chance to be optimized without having to compete with all the other networks
     - Fitness sharing prevents a single species from taking over the whole population (i.e. it preserves diversity)
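Explicit fitness sharing as described above can be sketched in a few lines: each network's fitness is divided by its species size, and offspring slots are allotted to species in proportion to their adjusted-fitness sums (the function and its representation of species are my own for illustration):

```python
def allot_offspring(species, pop_size):
    """species: list of species, each a list of raw member fitnesses.
    Returns the offspring count assigned to each species."""
    adjusted = [[f / len(s) for f in s] for s in species]  # share fitness within each species
    sums = [sum(s) for s in adjusted]
    total = sum(sums)
    return [round(pop_size * s / total) for s in sums]     # offspring proportional to species sum

# Usage: both species have raw fitness sum 16, but the large one gets diluted,
# so it cannot take over the population
counts = allot_offspring([[4, 4, 4, 4], [8, 8]], pop_size=12)
```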
  46. Complexification
     - In NEAT the population is initialized with networks sharing the same minimal topology (all inputs directly connected to the outputs)
     - More complex topologies are introduced by mutation and survive only if useful
     - Complexification has two advantages:
       - it makes the initialization of the population trivial
       - it keeps the dimensionality of the search space small throughout the evolutionary process
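A minimal sketch of that initialization: every genome starts with the same minimal topology, all inputs connected directly to all outputs, and the innovation number of each input-output link is the same in every genome, so historical markings line up across the whole initial population. The (innovation, (source, target), weight) gene layout is my own:

```python
import random

def minimal_genome(n_in, n_out):
    """All inputs directly connected to all outputs; no hidden nodes yet.
    Nodes 0..n_in-1 are inputs, nodes n_in..n_in+n_out-1 are outputs."""
    return [(i * n_out + j,            # deterministic innovation number
             (i, n_in + j),            # connection: input i -> output j
             random.uniform(-1, 1))    # only the weights start out random
            for i in range(n_in) for j in range(n_out)]

# Usage: 3 inputs and 2 outputs give a search space of just 6 connection genes
genome = minimal_genome(n_in=3, n_out=2)
```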
  47. Validation of NEAT
     - NEAT has been tested on the double pole balancing problem (also without velocity information)
     - Results show that NEAT solves the problem more effectively and faster than previous TWEANN systems
     - Each of the three key components of NEAT is necessary
  48. APPS
  49. Applications
     - Games
     - Navigation
     - Vehicle warning systems
  50. Conclusions
     - NEAT is an effective approach to NE thanks to:
       - historical markings
       - speciation
       - complexification
     - Future directions:
       - developmental encoding
       - adaptive synapses
       - rtNEAT
       - competitive coevolution