Heuristics: Genetic Algorithm 2


Introduction

Genetic Algorithms are good at taking large, potentially huge search spaces and navigating them, looking for optimal combinations of things, solutions you might not otherwise find in a lifetime. (Salvatore Mangano, Computer Design, May 1995)

The solution that you get from a genetic algorithm is the result of the iterative application of different stochastic operators.


To understand GAs and how they work, we have to introduce some new vocabulary:

individual = a structure that represents a solution, consisting of genes

chromosome = equivalent to an individual

gene = a bit of the binary representation of a solution

population = the set of individuals considered by a GA

parents = two chosen individuals of a population

crossover = combination of genes from two chromosomes

mutation = modification of a chromosome

fitness = quality of a solution

generation = one iteration of the optimization

genotype = encoded solution of a problem

phenotype = decoded solution of a problem


Basically, the structure of a Genetic Algorithm is always the same, as presented below:

  1. produce an initial population of individuals
  2. evaluate the fitness of all individuals
  3. WHILE termination condition not met DO
    1. select fitter individuals for reproduction
    2. recombine between individuals
    3. mutate individuals
    4. evaluate the fitness of the modified individuals
    5. generate a new population
  4. END WHILE
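
As a rough illustration, this loop could look as follows in Python for the MaxOne example discussed below; the function and parameter names (fitness, select, crossover, mutate, pop_size, rate) are chosen for this sketch and are not prescribed by the algorithm:

  import random

  def fitness(s):
      # f(s): number of 1-bits in the bit string (the MaxOne objective)
      return s.count("1")

  def select(population):
      # Fitness-proportionate (roulette-wheel) selection of one parent;
      # the +1 keeps every weight positive even for an all-zero string
      weights = [fitness(s) + 1 for s in population]
      return random.choices(population, weights=weights, k=1)[0]

  def crossover(p1, p2, point):
      # One-point crossover: swap the tails of the two parents at `point`
      return p1[:point] + p2[point:], p2[:point] + p1[point:]

  def mutate(s, rate):
      # Flip each bit independently with probability `rate`
      return "".join(b if random.random() >= rate else str(1 - int(b)) for b in s)

  def genetic_algorithm(pop_size=6, length=10, rate=0.1, generations=50):
      # 1. produce an initial population of individuals
      population = ["".join(random.choice("01") for _ in range(length))
                    for _ in range(pop_size)]
      for _ in range(generations):                     # 3. WHILE not terminated DO
          offspring = []
          while len(offspring) < pop_size:
              p1, p2 = select(population), select(population)    # select fitter individuals
              cut = random.randrange(1, length)                  # random crossover point
              c1, c2 = crossover(p1, p2, cut)                    # recombine
              offspring += [mutate(c1, rate), mutate(c2, rate)]  # mutate
          population = offspring                       # generate a new population
      return max(population, key=fitness)              # fittest individual found

  print(genetic_algorithm())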


Example

The Maxone problem:

We want to maximize the number of ones in a string of l binary bits.

This may seem trivial, because we already know the answer; however, we can think of it as maximizing the number of correct answers, each encoded by 1, to difficult questions:

Suppose l = 10 (bits per individual) and n = 6 (individuals).

Because every individual is encoded as a string of l binary bits, we can write, for example, for the number one:

1=0000000001 (10 bits)
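
As a small sketch of this encoding and of the fitness function f (which simply counts the 1-bits), assuming Python and plain bit strings:

  def fitness(s):
      # f(s): the number of correct answers, i.e. the number of 1-bits in the string
      return s.count("1")

  one = "0000000001"     # the number one written as a string of l = 10 bits
  print(fitness(one))    # -> 1, since the string contains exactly one 1-bit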


To get the initial population, we toss a fair coin 60 times (n · l = 6 · 10 bits):


\sum_{i=1}^{6} f(s_i) = 34


where f(s) is the fitness function (the number of ones in s)
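
A minimal sketch of this initialization step; the concrete bit strings, and therefore the total fitness of 34, depend on the actual coin tosses, so a run of the code below will produce different values:

  import random

  def fitness(s):
      # f(s): number of 1-bits in the bit string
      return s.count("1")

  # 6 individuals x 10 bits = 60 fair coin tosses
  population = ["".join(random.choice("01") for _ in range(10)) for _ in range(6)]

  print(population)                              # the initial individuals s_1, ..., s_6
  print(sum(fitness(s) for s in population))     # total fitness of the initial population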


The new population after performing selection (suppose we apply a random number generator and obtain this selection).
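
A common choice for this selection step is fitness-proportionate (roulette-wheel) selection, in which individual s_i is drawn with probability f(s_i) / \sum_j f(s_j). A sketch with illustrative bit strings (not the population from the example):

  import random

  def fitness(s):
      return s.count("1")            # f(s): number of 1-bits

  def roulette_select(population, k):
      # Draw k individuals with replacement, each with probability proportional to its fitness
      weights = [fitness(s) for s in population]
      return random.choices(population, weights=weights, k=k)

  population = ["1111100000", "1010101010", "1100000000",
                "0000000001", "1111111100", "0011001100"]   # illustrative only
  print(roulette_select(population, k=len(population)))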



We only perform a crossover for the pairs and


Crossover points: 2 and 5
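
A sketch of one-point crossover at positions 2 and 5, using illustrative parent strings (not the actual pairs from the example):

  def crossover(p1, p2, point):
      # One-point crossover: each child takes the head of one parent and the tail of the other
      return p1[:point] + p2[point:], p2[:point] + p1[point:]

  pair_a = ("1111100000", "0000011111")   # illustrative parents
  pair_b = ("1010101010", "0101010101")   # illustrative parents

  print(crossover(*pair_a, point=2))      # crossover point 2
  print(crossover(*pair_b, point=5))      # crossover point 5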


Before crossover:


After crossover:


Before applying random mutation:

For each bit that we copy to the new population, we allow a small probability of error, for instance 0.1.
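
A sketch of this bit-flip mutation with error probability 0.1 (the bit string is illustrative):

  import random

  def mutate(s, rate=0.1):
      # Each bit is copied unchanged with probability 1 - rate and flipped with probability rate
      return "".join(b if random.random() >= rate else str(1 - int(b)) for b in s)

  print(mutate("1111100000"))   # each bit flips with probability 0.1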



After applying (random) mutation, with updated fitness values:

\sum_{i=1}^{6} f(s_i''') = 37



This is an improvement of about 9 percent ((37 - 34)/34 ≈ 0.088).

You could repeat the algorithm until a stopping criterion is met.

References

Nobal Niraula, Genetic Algorithm: http://de.slideshare.net/kancho/genetic-algorithm-by-example (download: 24.06.13)

Alfred Hermes, Heuristik: Genetischer Algorithmus am Beispiel des 0/1-Rucksackproblems: http://learntech.rwth-aachen.de/lehre/ss05/fdi/folien/VS11HTML/GenAlg.pdf (download: 24.06.13)