Six Degrees of Separation to Darrell Whitley


12/Sept/2006

Session #4, PPSN IX

Reykjavik, Iceland

JJ Merelo Guervós, Dept. of Computer Architecture and Technology, U. of Granada

2

Everything is connected

3

Improved Squeaky Wheel Optimisation for Driver Scheduling
Uwe Aickelin, Edmund K. Burke, Jingpeng Li

SWO is an algorithm based on a construction-analysis-prioritization cycle.

The improved version, ISWO, introduces selection and mutation within the solution: each component must prove its fitness.

Driver scheduling involves partitioning blocks of work (one vehicle each) into legal shifts; it is a set-covering integer linear programming problem.
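
As a rough illustration of that cycle, here is a minimal, generic SWO loop in Python. The `construct` and `blame` callables are hypothetical problem-specific pieces (building shifts from a priority order and scoring how badly each unit of work is served); this is a sketch of the general scheme, not the authors' ISWO.

```python
def squeaky_wheel(elements, construct, blame, iterations=100):
    """Generic SWO loop: CONSTRUCT a solution from a priority order,
    ANALYSE it by blaming badly served elements, then PRIORITIZE by
    moving the 'squeaky wheels' to the front for the next pass."""
    priority = list(elements)
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        solution, cost = construct(priority)     # greedy build
        if cost < best_cost:
            best, best_cost = solution, cost
        scores = blame(solution)                 # per-element trouble score
        priority.sort(key=lambda e: -scores[e])  # troublemakers first
    return best, best_cost
```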

4

Search bias in ant colony optimization: on the role of competition-balanced systems
Blum, C.; Dorigo, M.

5

An Evolutionary Approach to the Inference of Phylogenetic Networks
Juan Diego Trujillo and Carlos Cotta

Tries to find a phylogenetic network that models a set of sequences of molecular data using EAs.

Networks include reticulation events (horizontal transfer, recombination, hybridization).

Heuristics-based genetic operators; fitness function based on likelihood.

The inferred network models are close to the evolutionary model hidden in the data.

6

Yao, X. (1999). Evolving artificial neural networks. Proceedings of the IEEE, 87(9), pp. 1423-1447.

7

Genetic Algorithm based on Independent Component Analysis for Global Optimization
Gang Li, Kin Hong Lee, Kwong Sak Leung

ICA projects an n-dimensional set onto a lower-dimensional space. Since the components are independent of each other, they can be independently maximized.

Solutions are comparable to those of other algorithms, but with fewer fitness evaluations.
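
A loose sketch of the idea (not the authors' algorithm) using scikit-learn's FastICA: learn independent directions from a sample of points, then run a cheap 1-D line search along each direction separately. The sphere objective and search ranges are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import FastICA

def sphere(x):                      # toy objective (assumption)
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
dim = 5
X = rng.uniform(-3, 3, size=(200, dim))   # initial random sample

ica = FastICA(n_components=dim, random_state=0).fit(X)
directions = ica.mixing_.T                # one search direction per component

best = min(X, key=sphere).copy()
for d in directions:                      # optimize each component on its own
    d = d / np.linalg.norm(d)
    candidates = [best + s * d for s in np.linspace(-3, 3, 61)]
    best = min(candidates, key=sphere)
print(best, sphere(best))
```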

8

X. Yao and Y. Liu. Fast evolution strategies

9

When Do Heavy-Tail Distributions Help?
Hansen, Gemperle, Auger, and Koumoutsakos

The Cauchy distribution is an example of a heavy-tail distribution, as opposed to the Gaussian.

Studies the probability of sampling a better solution using different Cauchy distributions.

Anisotropic Cauchy mutation obtains exceptionally good results on the Rastrigin function.
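
For concreteness, here is how the mutation types compare in NumPy. The step size and Rastrigin definition are the usual textbook ones, and the isotropic/anisotropic distinction shown is our reading of the summary above, not the paper's exact experimental setup.

```python
import numpy as np

def rastrigin(x):
    return 10 * len(x) + float(np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

rng = np.random.default_rng(1)
dim, sigma = 10, 0.3
x = rng.uniform(-5, 5, size=dim)             # current parent

gaussian = x + sigma * rng.normal(size=dim)  # light tails: mostly small steps

u = rng.normal(size=dim)
u /= np.linalg.norm(u)                       # uniform random direction
iso_cauchy = x + sigma * rng.standard_cauchy() * u   # one heavy-tailed radius

aniso_cauchy = x + sigma * rng.standard_cauchy(size=dim)
# independent heavy tails per coordinate: occasional huge axis-aligned jumps,
# which suits separable functions such as Rastrigin

print(rastrigin(gaussian), rastrigin(iso_cauchy), rastrigin(aniso_cauchy))
```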

10

Yao, X., Liu, Y. (1998). Towards designing artificial neural networks by evolution.

11

Neuroevolution with Analog Genetic Encoding
Peter Dürr, Claudio Mattiussi, and Dario Floreano

A recent paper by Banzhaf et al. in Nature Reviews Genetics asks that EC follow known biological facts more closely, turning it into "computational evolution".

This paper uses an encoding for neural nets that is closer to real genomes: it is based on tokens that represent problem objects, with genetic operators to match.
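
A toy flavour of token-based genomes, only to show the style of encoding; the real AGE scheme derives connection weights by aligning the sequences between tokens, which this regex-based toy does not attempt.

```python
import re

# Hypothetical genome: device tokens ("NEURON" ... "TERM") delimit the
# fragments that would parameterise each neuron; everything else is junk,
# so insertions and duplications elsewhere leave the decoded devices intact.
genome = "xqNEURONgattacaTERMzzNEURONtagcatTERMq"
devices = re.findall(r"NEURON(.*?)TERM", genome)
print(devices)   # ['gattaca', 'tagcat']
```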

12

Castillo-Valdivieso, P.A. et al. (2002). Statistical analysis of the parameters of a neuro-genetic algorithm.

13

Assortative mating drastically alters the magnitude of error thresholds
Gabriela Ochoa and Klaus Jaffe

Beyond the error threshold, evolved structures cannot be reproduced in the quasispecies evolution model (Eigen and Schuster).

In EC, this is related to the exploration/exploitation balance.

Assortative mating produces the highest error threshold, whereas asexual reproduction produces the lowest.
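
A minimal sketch of positive assortative mating on bitstrings, assuming Hamming distance as the similarity measure; the paper works in the quasispecies framework, so this is only the EC-side analogue.

```python
import numpy as np

rng = np.random.default_rng(2)
pop = rng.integers(0, 2, size=(20, 32))      # bitstring population

def assortative_pair(pop, candidates=5):
    """Pick one parent at random, then the most similar of a few random
    candidates; assortative mating makes crossover between dissimilar
    genotypes rare, which is what raises the error threshold."""
    i = rng.integers(len(pop))
    others = [k for k in rng.choice(len(pop), size=candidates, replace=False)
              if k != i]
    j = min(others, key=lambda k: int(np.sum(pop[i] != pop[k])))
    return pop[i], pop[j]

p1, p2 = assortative_pair(pop)
```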

14

Eiben, Hinterding, Michalewicz (1999). Parameter control in evolutionary algorithms.

15

Cumulative Step Length Adaptation on Ridge Functions
Dirk V. Arnold

Ridge functions are used to test ES. This paper studies the performance of multirecombinative ES with cumulative step length adaptation for different ridge topologies.
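
As background, the ridge function family and a compact (mu/mu, lambda)-ES with cumulative step length adaptation. The CSA constants below are one common choice from the literature, not necessarily the settings analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def ridge(x, d=1.0, alpha=2.0):
    """Ridge family (minimisation): progress along x[0] is rewarded,
    distance from the ridge axis penalised; alpha=2 is the parabolic
    ridge, alpha=1 the sharp ridge."""
    return -x[0] + d * float(np.sum(x[1:] ** 2)) ** (alpha / 2.0)

def csa_es(f, x0, sigma=1.0, lam=10, iters=300):
    n, mu = len(x0), lam // 2
    c = np.sqrt(mu / (n + mu))                 # cumulation constant
    damp = 1 + np.sqrt(mu / n)                 # damping for the sigma update
    chi_n = np.sqrt(n) * (1 - 1 / (4 * n))     # ~ E||N(0, I)||
    x, s = np.array(x0, float), np.zeros(n)
    for _ in range(iters):
        z = rng.normal(size=(lam, n))
        order = np.argsort([f(x + sigma * zi) for zi in z])
        z_sel = z[order[:mu]].mean(axis=0)     # recombined step
        s = (1 - c) * s + np.sqrt(c * (2 - c) * mu) * z_sel
        sigma *= np.exp((np.linalg.norm(s) / chi_n - 1) * c / damp)
        x = x + sigma * z_sel
    return x, sigma

x, sigma = csa_es(ridge, np.zeros(10))
```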

16

M. Herdy. Reproductive isolation as strategy parameter in hierarchically organized evolution strategies.

17

Self-Adaptation on the Ridge Function Class: First Results for the Sharp Ridge
Hans-Georg Beyer and Silja Meyer-Nieberg

A different self-adaptation mechanism, a different ridge (the sharp ridge, in this case), and a different ES (non-recombinative strategies).

Why could it fail?
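
In contrast to CSA above, mutative self-adaptation carries the step size inside each individual. The standard log-normal update rule looks like this (the usual formulation, which may differ in detail from the paper's):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10
tau = 1 / np.sqrt(2 * n)                 # common learning-rate choice

def self_adaptive_offspring(x, sigma):
    """Mutate the step size first, then use the mutated step size for the
    object variables; selection acts on both, so good step sizes
    hitchhike along with good solutions."""
    child_sigma = sigma * np.exp(tau * rng.normal())
    child_x = x + child_sigma * rng.normal(size=len(x))
    return child_x, child_sigma
```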

18

Arabas, J., Michalewicz, Z., Mulawka, J. (1994). GAVaPS: a Genetic Algorithm with Varying Population Size.

19

Self-regulated Population Size in Evolutionary Algorithms
Carlos Fernandes and Agostinho Rosa

There are many self-regulated population-size algorithms: GAVaPS, APGA, ProFiGa (a funky acronym is not a requisite). Some are based on age.

SRP-EA combines CHC and GAVaPS. It achieves better success rates, at the cost of more evaluations.
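
The core of GAVaPS-style self-regulation is assigning each newborn a lifetime from its relative fitness, so population size becomes an emergent quantity. A sketch of one proportional allocation variant; the constants are illustrative.

```python
def allocate_lifetime(fitness, f_min, f_max, min_life=1, max_life=7):
    """Better individuals live longer; each generation every individual
    ages by one and dies once its lifetime is exceeded, so the population
    grows and shrinks on its own (maximisation assumed here)."""
    if f_max == f_min:
        return (min_life + max_life) // 2
    frac = (fitness - f_min) / (f_max - f_min)
    return min_life + round(frac * (max_life - min_life))
```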

20

Is Self-Adaptation of Selection Pressure and Population Size Possible? – A Case Study
A.E. Eiben, M.C. Schut, A.R. de Wilde

Self-adapting selection operators and population size can yield results as good as or better than self-adapting variation operators.

Selection parameters are encoded in the individuals, and a consensus value is reached.

Self-adapting selection increases speed.
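
One way to read "consensus value": every individual carries its own preferred selection parameter, and the population acts on their aggregate. A hedged sketch with a per-individual tournament-size gene; the dict field `k` and the rounded-mean rule are our inventions, not the paper's exact mechanism.

```python
import random

def consensus_tournament_size(pop):
    """Aggregate the per-individual tournament-size genes into one
    population-wide value (here: the rounded mean, floored at 2)."""
    return max(2, round(sum(ind["k"] for ind in pop) / len(pop)))

def tournament_select(pop, fitness):
    k = consensus_tournament_size(pop)
    contestants = random.sample(pop, k)
    return max(contestants, key=fitness)
```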

21

Zitzler, E., Laumanns, M., and Thiele, L. (2001). SPEA2: Improving the strength Pareto evolutionary algorithm.

22

Solving Multi-Objective Optimisation Problems Using the Potential Pareto Regions Evolutionary Algorithm
Nasreddine Hallam, Graham Kendall and Peter Blanchfield

Introduces the Potential Pareto Regions Evolutionary Algorithm (PPR-EA).

The fitness of an individual is the least improvement that individual needs in order to reach non-dominated status.
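
A sketch of that fitness assignment as we read it (the paper's exact definition may differ): for each dominated point, consider every point that dominates it and measure the smallest objective gap it would have to close to escape that dominator.

```python
import numpy as np

def dominates(a, b):
    """Pareto dominance for minimisation."""
    return bool(np.all(a <= b) and np.any(a < b))

def least_improvement_fitness(objs):
    """0 for non-dominated points; otherwise the improvement needed to
    escape the hardest dominator (our reading of the summary; to be
    minimised)."""
    objs = np.asarray(objs, float)
    fit = np.zeros(len(objs))
    for i, x in enumerate(objs):
        gaps = [float(np.min(x - y)) for y in objs if dominates(y, x)]
        fit[i] = max(gaps) if gaps else 0.0
    return fit
```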

23

K. Deb, L. Thiele, M. Laumanns, and E. Zitzler (2002). Scalable multi-objective optimization test problems.

24

Pareto Set and EMOA Behavior for Simple Multimodal Multiobjective Functions
Mike Preuss, Boris Naujoks, and Günter Rudolph

Studies the often-disregarded Pareto set: changes induced in the Pareto set alter the ability of algorithms to track the Pareto front.

A measure of the quality of solutions in the solution space is needed, similar to the S-metric in objective space.
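
The objective-space measure mentioned here, the S-metric (hypervolume), is easy to state in 2-D. A minimal version, assuming a mutually non-dominated minimisation front whose points all dominate the reference point:

```python
def hypervolume_2d(front, ref):
    """Sweep the front left to right, adding one rectangle per point;
    `front` is a list of (f1, f2) pairs, `ref` the reference point."""
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(front):       # ascending f1 implies descending f2
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

print(hypervolume_2d([(1, 3), (2, 2), (3, 1)], ref=(4, 4)))  # 3 + 2 + 1 = 6
```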

25

About Selecting the Personal Best in Multi-objective Particle Swarm Optimization
Jürgen Branke and Sanaz Mostaghim

Selecting a good guide bodes well for the future of a particle, but particles can also memorize all non-dominated personal best solutions.

Keeping a personal archive yields better results than traditional methods.
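
A minimal sketch of the per-particle archive update (minimisation; our own simplification of the idea, not the paper's code):

```python
def dominates(a, b):
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def update_pbest_archive(archive, new_point):
    """Keep every mutually non-dominated personal best instead of a single
    pbest: discard the newcomer if dominated, otherwise add it and drop
    whatever it dominates."""
    if any(dominates(a, new_point) for a in archive):
        return archive
    return [a for a in archive if not dominates(new_point, a)] + [new_point]
```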

26

27

A Particle Swarm Optimizer for Constrained Numerical Optimization
Cagnina, Esquivel, Coello

Uses a single method to handle all constraints.

Results are reported using standard test functions.
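
One widely used "single method" for all constraints is Deb's feasibility rules; whether this matches the paper's mechanism is an assumption, but it shows the shape of such a comparator. Each solution is summarised as an (objective value, total constraint violation) pair.

```python
def better(a, b):
    """Deb-style rules for minimisation: feasible beats infeasible,
    two infeasibles compare by total violation, two feasibles by
    objective value. a and b are (objective, violation) pairs."""
    (fa, va), (fb, vb) = a, b
    if va == 0 and vb == 0:
        return fa < fb          # both feasible: plain objective
    if (va == 0) != (vb == 0):
        return va == 0          # exactly one feasible: it wins
    return va < vb              # both infeasible: less violation wins
```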

28

Modelling Group-Foraging Behaviour with Particle Swarms
Cecilia Di Chio, Riccardo Poli, and Paolo Di Chio

Uses a nature-inspired technique to model a natural problem: group foraging.

Results are encouraging.

29

An evolutive approach for the delineation of local labour markets
Flórez, Casado, Martínez-Bernabeu

Uses a GA for delineating local labour markets in Valencia, Spain, with several operators: mutation and crossover.

Better results than classical algorithms.

30

That's all

Thank you and enjoy the session