An Introduction to Evolutionary Algorithms
Karthik Sindhya, PhD
Postdoctoral Researcher
Industrial Optimization Group
Department of Mathematical Information Technology
[email protected] http://users.jyu.fi/~kasindhy/
• Nature Inspired Algorithms
• Differential Evolution algorithm
• Constraint handling
• Applications
Overview
Nature Inspired Algorithms
• Nature provides some of the most efficient ways to solve problems.
– Algorithms imitating processes in nature, or inspired by nature, are called Nature Inspired Algorithms.
• What type of problems?
– Aircraft wing design
– Wind turbine design
– Bionic car
Nature Inspired Algorithms
Wind turbine: performance improvement by 40%. The tubercle-inspired blades reduce turbulence across the surface, increasing the angle of attack and decreasing drag. (Source: Popular Mechanics)
Bionic car: hexagonal plates result in door paneling one-third lighter than conventional paneling, but just as strong. (Source: Popular Mechanics)
• Bullet train
Nature Inspired Algorithms
The train's nose is designed after the beak of a kingfisher, which dives smoothly into water. (Source: Popular Mechanics)
• Optimization – An act, process, or methodology of making something (as a design, system, or decision) as fully perfect, functional, or effective as possible. (http://www.merriam-webster.com/dictionary)
• Nature as an optimizer
– Birds: Minimize drag.
– Humpback whale: Maximize maneuverability (enhanced lift devices to control flow over the flipper and maintain lift at high angles of attack).
– Boxfish: Minimize drag and maximize rigidity of exoskeleton.
– Kingfisher: Minimize micro-pressure waves.
• Consider an optimization problem of the form
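The equation on the slide is not in the transcript; a standard constrained formulation consistent with the discussion that follows is:

$$
\begin{aligned}
\min_{x \in S}\ \ & f(x) \\
\text{s.t.}\ \ & g_j(x) \le 0, \quad j = 1,\dots,J, \\
& h_k(x) = 0, \quad k = 1,\dots,K,
\end{aligned}
$$

where f is the objective function, the g_j and h_k are inequality and equality constraints, and S is the (possibly discrete or mixed-variable) search space.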
Nature Inspired Algorithms for Optimization
• Objective and constraint functions can be non-differentiable.
• Constraints can be nonlinear.
• Discrete/discontinuous search space.
• Mixed variables (integer, real, Boolean, etc.).
• Large number of constraints and variables.
• Objective functions can be multimodal.
– Multimodal functions have more than one optimum, but can have either a single global optimum or several.
• Computationally expensive objective functions and constraints.
Practical Optimization Problems – Characteristics!
Practical Optimization Problems – Characteristics!
[Diagram: the optimization algorithm passes a decision vector to a simulation model, which returns an objective vector]
• Different methods for different types of problems.
• Constraint handling, e.g. using the penalty method, is sensitive to penalty parameters.
• Often get stuck in local optima (lack global perspective).
• Usually need knowledge of first/second order derivatives of objective functions and constraints.
Traditional Optimization Techniques – Problems!
Computational intelligence:
– Nature inspired algorithms
– Fuzzy logic systems
– Neural networks
Nature Inspired Algorithms for Optimization
Nature inspired algorithms:
– Evolutionary algorithms
• Genetic algorithm
• Differential evolution
– Swarm optimization
• Particle swarm optimization
• Ant colony optimization
– .... and many more.
Nature Inspired Algorithms for Optimization
Evolution
[Images: evolution illustrated by humans, Macintosh computers, and Nokia phones]
Evolutionary Algorithms
Charles Darwin
Offspring are created by reproduction, mutation, etc.
Natural selection - A guided search procedure
Individuals suited to the environment survive, reproduce and pass their genetic traits to offspring
Populations adapt to their environment. Variations accumulate over time to generate new species
• Terminologies
1. Individual - carrier of the genetic information (chromosome). It is characterized by its state in the search space and its fitness (objective function value).
2. Population - pool of individuals which allows the application of genetic operators.
3. Fitness function - the term "fitness function" is often used as a synonym for objective function.
4. Generation - (natural) time unit of the EA; an iteration step of an evolutionary algorithm.
Evolutionary Algorithms
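To make these four terms concrete, here is a minimal Python sketch (the class and function names are illustrative, not from the lecture):

```python
import random

class Individual:
    """Carrier of the genetic information (chromosome)."""
    def __init__(self, genes):
        self.genes = genes      # state in the search space
        self.fitness = None     # objective function value

def evaluate(ind, objective):
    """Fitness function: here simply the objective function value."""
    ind.fitness = objective(ind.genes)

# Population: a pool of individuals genetic operators can act on.
population = [Individual([random.uniform(-5, 5) for _ in range(3)])
              for _ in range(20)]
for ind in population:
    evaluate(ind, lambda x: sum(v * v for v in x))  # sphere function
# One pass of variation + selection over this pool is one generation.
```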
[Diagram: parents selected from the population undergo crossover and mutation to produce offspring, which rejoin the population]
Evolutionary Algorithms
Step 1: t := 0
Step 2: Initialize P(t)
Step 3: Evaluate P(t)
Step 4: While not terminate do
  P'(t) := variation [P(t)];
  evaluate [P'(t)];
  P(t+1) := select [P'(t) ∪ P(t)];
  t := t + 1;
od
Reproduced from "Evolutionary Computation: Comments on the History and Current State" – Bäck et al.
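A runnable Python sketch of this loop; the concrete operators (blend crossover, Gaussian mutation, truncation selection over P'(t) ∪ P(t)) are illustrative choices, not prescribed by the slide:

```python
import random

def evolve(objective, n_var, pop_size=20, n_gen=100,
           sigma=0.1, lo=-5.0, hi=5.0):
    """Generic EA: initialize/evaluate P(t), then vary, evaluate, select."""
    pop = [[random.uniform(lo, hi) for _ in range(n_var)]
           for _ in range(pop_size)]                      # Step 2
    fit = [objective(x) for x in pop]                     # Step 3
    for t in range(n_gen):                                # Step 4
        offspring = []
        for _ in range(pop_size):                         # P'(t) := variation[P(t)]
            p1, p2 = random.sample(pop, 2)
            a = random.random()
            child = [a * u + (1 - a) * v for u, v in zip(p1, p2)]
            child = [min(hi, max(lo, g + random.gauss(0, sigma)))
                     for g in child]
            offspring.append(child)
        off_fit = [objective(x) for x in offspring]       # evaluate[P'(t)]
        merged = sorted(zip(pop + offspring, fit + off_fit),
                        key=lambda pair: pair[1])         # select from P'(t) U P(t)
        pop = [x for x, _ in merged[:pop_size]]
        fit = [f for _, f in merged[:pop_size]]
    return pop[0], fit[0]

best_x, best_f = evolve(lambda x: sum(v * v for v in x), n_var=3)
```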
Evolutionary algorithms = Selection + Crossover + Mutation
Evolutionary Algorithms
• Over generations, the population mean approaches the optimum
• The population variance reduces
Evolutionary Algorithms
[Figure: efficiency plotted against problem type for a random scheme and a robust scheme; the robust scheme maintains good efficiency across a broad range of problem types]
Robustness = Breadth + Efficiency
(Goldberg, 1989)
• Selection – roulette wheel, tournament, steady state, etc.
– Motivation is to preserve the best (make multiple copies) and eliminate the worst
• Crossover – simulated binary crossover, linear crossover, blend crossover, etc.
– Create new solutions by considering more than one individual
– Global search for new and hopefully better solutions
• Mutation – polynomial mutation, random mutation, etc.
– Keep diversity in the population
– 010110 → 010100 (bit-wise mutation)
Evolutionary Algorithms
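As a sketch of the variation operators just listed, here are a blend crossover and the bit-wise mutation from the example, in Python (parameter values are illustrative):

```python
import random

def blend_crossover(p1, p2, alpha=0.5):
    """Blend (BLX-alpha) crossover: each child gene is drawn from an
    interval extended by alpha around the two parent genes."""
    child = []
    for u, v in zip(p1, p2):
        lo, hi = min(u, v), max(u, v)
        d = alpha * (hi - lo)
        child.append(random.uniform(lo - d, hi + d))
    return child

def bitwise_mutation(bits, p_mut=1 / 6):
    """Flip each bit with probability p_mut, e.g. [0,1,0,1,1,0] -> [0,1,0,1,0,0]."""
    return [b ^ 1 if random.random() < p_mut else b for b in bits]
```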
• Tournament selection
Evolutionary Algorithms
[Diagram: binary tournaments over a population with fitness values 23, 30, 24, 11, 37, 9 (minimization). In each pairwise tournament the better individual is copied to the mating pool; after two rounds of tournaments, the worst individuals, 37 and 30, are deleted from the population.]
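A minimal Python sketch of binary tournament selection consistent with the example above (assuming minimization):

```python
import random

def tournament_selection(pop, fitness, n_select, k=2):
    """Each selection draws k random individuals; the one with the
    best (lowest) fitness wins a place in the mating pool."""
    winners = []
    for _ in range(n_select):
        contestants = random.sample(range(len(pop)), k)
        best = min(contestants, key=lambda i: fitness[i])
        winners.append(pop[best])
    return winners

pop = ["a", "b", "c", "d", "e", "f"]
fitness = [23, 30, 24, 11, 37, 9]
mating_pool = tournament_selection(pop, fitness, n_select=6)
# The worst individuals (fitness 37 and 30) rarely win and drop out.
```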
• Roulette wheel selection (proportional selection)
– Each individual is selected with probability proportional to its fitness, so weaker solutions can also survive.
Evolutionary Algorithms
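A Python sketch of roulette wheel selection, assuming non-negative fitness values to be maximized (proportional selection needs fitness values that can serve as weights):

```python
import random

def roulette_wheel_selection(pop, fitness, n_select):
    """Spin the wheel n_select times; selection probability is
    proportional to fitness, so weak solutions can still survive."""
    total = sum(fitness)
    chosen = []
    for _ in range(n_select):
        r = random.uniform(0, total)
        acc = 0.0
        for ind, f in zip(pop, fitness):
            acc += f
            if acc >= r:
                chosen.append(ind)
                break
        else:
            chosen.append(pop[-1])  # guard against float round-off
    return chosen
```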
• Concept of exploration vs exploitation.
• Exploration – search for promising solutions (crossover and mutation operators).
• Exploitation – preferring the good solutions (selection operator).
• Excessive exploration – Random search.
• Excessive exploitation – Premature convergence.
Evolutionary Algorithms
Good evolutionary algorithm
[Diagram: a balance between exploitation and exploration]
Evolutionary Algorithms
Classical gradient based algorithms
• Convergence to an optimal solution usually depends on the starting solution.
• Most algorithms tend to get stuck at a locally optimal solution.
• An algorithm efficient in solving one class of optimization problem may not be efficient in solving others.
• Algorithms cannot be easily parallelized.
Evolutionary algorithms
• Convergence to an optimal solution is designed to be independent of initial population.
• A search based algorithm; the population helps it avoid getting stuck at a locally optimal solution.
• Can be applied to a wide class of problems without major changes in the algorithm.
• Can be easily parallelized.
Evolutionary Algorithms
Fitness Landscapes
[Figure: four f(x) vs. x landscapes - "Ideal and best case", "Multimodal", "Nightmare", and "Teaser" - using traditional gradient based methods]
Fitness Landscapes
[Figure: the same four f(x) vs. x landscapes - "Ideal and best case", "Multimodal", "Nightmare", and "Teaser" - using population based algorithms]
• GA: John Holland in 1962 (UMich)
• Evolutionary Strategy: Rechenberg and Schwefel in 1965 (Berlin)
• Evolutionary Programming: Larry Fogel in 1965 (California)
• First ICGA: 1985 at Carnegie Mellon University
• First GA book: Goldberg (1989)
• First FOGA workshop: 1990 in Indiana (theory)
• First fusion: 1990s (evolutionary algorithms)
• Journals: ECJ (MIT Press), IEEE TEC, Natural Computation (Elsevier)
• GECCO and CEC since 1999, PPSN since 1990
• About 20 major conferences each year
History of Evolutionary Algorithms
• Proposed by R. Storn and K. Price (1997)
– Storn, R., Price, K. (1997). "Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces", Journal of Global Optimization 11: 341–359.
• A population based approach for minimization of functions
– A maximization problem is converted to a minimization problem: max f(x) = −min(−f(x)).
• Parameters to be set
– NP, population size: (5–10) × number of variables
– F, scaling factor: in [0, 2]
– CR, crossover ratio: in [0, 1]
– NGEN, maximum number of generations
Differential Evolution
Differential Evolution
[Diagram: one DE step on a population x1 … x5 with fitness values f1 … f5.
Mutation: a difference vector (x4 − x5) is scaled and added to a third member to give the donor vector v1 = x3 + F(x4 − x5).
Crossover: each component of the trial vector is copied from the donor vector when Rand < CR, otherwise from the target vector.
Selection: the trial vector (fitness fI) replaces the target vector (fitness fII) only if fI < fII; the survivors C1 … C5 form the next population.]
• DE scheme – DE/x/y/z
– x: specifies the vector to be mutated, which here is "rand" (a randomly chosen vector).
– y: number of difference vectors used.
– z: denotes the crossover scheme. The variant used here is "bin" (binomial); 'exp' (exponential) is also available.
• DE/rand/1/bin
Differential Evolution
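A compact Python sketch of DE/rand/1/bin as just described; the objective, bounds, and parameter values are illustrative:

```python
import random

def de_rand_1_bin(objective, n_var, lo=-5.0, hi=5.0,
                  NP=30, F=0.8, CR=0.9, NGEN=200):
    """DE/rand/1/bin: random base vector, one difference vector,
    binomial crossover, one-to-one greedy selection."""
    pop = [[random.uniform(lo, hi) for _ in range(n_var)] for _ in range(NP)]
    fit = [objective(x) for x in pop]
    for _ in range(NGEN):
        for i in range(NP):
            # Mutation: v = x_r1 + F * (x_r2 - x_r3), r1, r2, r3 distinct from i
            r1, r2, r3 = random.sample([j for j in range(NP) if j != i], 3)
            v = [pop[r1][k] + F * (pop[r2][k] - pop[r3][k])
                 for k in range(n_var)]
            # Binomial crossover: take the donor component when Rand < CR;
            # j_rand guarantees at least one donor component survives.
            j_rand = random.randrange(n_var)
            trial = [v[k] if (random.random() < CR or k == j_rand)
                     else pop[i][k] for k in range(n_var)]
            trial = [min(hi, max(lo, t)) for t in trial]  # box-constraint repair
            # Selection: the trial replaces the target only if it is better.
            f_trial = objective(trial)
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial
    best = min(range(NP), key=lambda i: fit[i])
    return pop[best], fit[best]

x_best, f_best = de_rand_1_bin(lambda x: sum(v * v for v in x), n_var=5)
```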
Differential Evolution
[Diagram: contour plot of a two-variable function (x1, x2) with the minimum marked.
Mutation: the donor vector v1 = x3 + F(x4 − x5) is built from population members x3, x4, x5.
Crossover: mixing the target vector t1 with the donor v1 yields trial candidates c1 and c2.]
Constraint Handling
• Penalty parameter-less approach (see the sketch after this slide)
– A feasible solution is preferred to an infeasible solution.
– When both solutions are feasible, choose the one with the better objective function value.
– When both solutions are infeasible, choose the one with the lower constraint violation.
• Box constraints
– If a variable is below its lower bound or above its upper bound, either:
• set it to the violated bound, or
• replace it with a random value inside the bounds.
Constraint Handling
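A minimal Python sketch of this pairwise comparison (Deb's penalty parameter-less rules) and the two box-constraint repairs; all names here are my own:

```python
import random

def is_better(f_a, viol_a, f_b, viol_b):
    """Return True if solution a beats solution b: feasibility first,
    then objective value, then total constraint violation."""
    if viol_a == 0 and viol_b == 0:   # both feasible: better objective wins
        return f_a < f_b
    if viol_a == 0 or viol_b == 0:    # exactly one feasible: it wins
        return viol_a == 0
    return viol_a < viol_b            # both infeasible: lower violation wins

def repair_box(x, lower, upper, randomize=False):
    """Repair a variable outside its bounds: clip to the violated bound,
    or resample a random value inside the bounds."""
    out = []
    for v, lo, hi in zip(x, lower, upper):
        if v < lo or v > hi:
            v = random.uniform(lo, hi) if randomize else min(hi, max(lo, v))
        out.append(v)
    return out
```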
• No guarantee of finding an optimal solution in finite time
– Only asymptotic convergence.
• Contain a number of parameters
– Sometimes the result is highly dependent on the parameter settings.
– Self-adaptive parameters are commonly used.
• Computationally very expensive
– Metamodels of the objective and constraint functions are commonly used.
Limitations of Evolutionary Algorithms
• Application 1
– Tracking a criminal suspect through "face space"
• Caldwell and Johnston, 1991
• Objective function: fitness rating on a nine-point scale
Applications
• Optimization (Min/Max) of functions
• Airfoil optimization
• Evolving optimal structure
• Games
Applications