CAP6938 Neuroevolution and Artificial Embryogeny Evolutionary Computation Theory
Dr. Kenneth Stanley, January 25, 2006

Page 1: CAP6938 Neuroevolution and Artificial Embryogeny Evolutionary Computation Theory

CAP6938 Neuroevolution and Artificial Embryogeny

Evolutionary Computation Theory

Dr. Kenneth Stanley

January 25, 2006

Page 2

Schema Theory (Holland 1975)

• A building block is a set of genes with good values

• Schemas are a formalization of building blocks
• Schemas are bit strings with *’s (wildcards)
• 1****0 is the set of all 6-bit strings that begin with 1 and end with 0
• Order 2: 2 defined bits
• A schema defines a hyperplane
• Example: 1**
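These definitions are easy to make concrete. A minimal Python sketch (not from the slides; the helper names are my own) that computes a schema’s order and defining length and enumerates its instances:

```python
import itertools

def matches(schema, s):
    """True if bit string s is an instance of schema (* is a wildcard)."""
    return all(c == '*' or c == b for c, b in zip(schema, s))

def order(schema):
    """Order: the number of defined (non-wildcard) bits."""
    return sum(c != '*' for c in schema)

def defining_length(schema):
    """Defining length: distance between the outermost defined positions."""
    defined = [i for i, c in enumerate(schema) if c != '*']
    return defined[-1] - defined[0] if defined else 0

schema = '1****0'
instances = [s for s in (''.join(b) for b in itertools.product('01', repeat=6))
             if matches(schema, s)]
print(order(schema), defining_length(schema), len(instances))  # 2 5 16
```

The 16 instances are exactly the 6-bit strings that begin with 1 and end with 0, i.e. the hyperplane the schema defines.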

Page 3

Schema Fitness

• A GA implicitly evaluates fitness for all its schemas
• The average fitness of a schema is the average fitness of all possible instances of it
• A GA behaves as if it were really storing these averages
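For short strings, a schema’s average fitness can be computed by brute force. A sketch, assuming a toy one-max fitness (count of 1-bits); all names here are illustrative:

```python
import itertools

def schema_fitness(schema, fitness, length):
    """Average fitness over every string that matches the schema."""
    insts = [s for s in (''.join(b) for b in itertools.product('01', repeat=length))
             if all(c == '*' or c == x for c, x in zip(schema, s))]
    return sum(fitness(s) for s in insts) / len(insts)

one_max = lambda s: s.count('1')           # toy fitness: number of 1-bits
print(schema_fitness('1***', one_max, 4))  # 2.5
print(schema_fitness('0***', one_max, 4))  # 1.5
```

Under one-max, the schema 1*** averages one more bit of fitness than 0***, which is the kind of implicit comparison the GA is said to be making.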

Page 4

Schema Theorem on Selection

• Idea: Calculate the approximate dynamics of increases and decreases of schema instances

• Instances of H at time t: $m(H, t)$
• Observed avg. fitness of H at time t: $\hat{u}(H, t)$
• Goal: Calculate $E[m(H, t+1)]$
• Using the fact that the number of offspring is proportional to fitness:

$$E[m(H, t+1)] = \sum_{x \in H} \frac{f(x)}{\bar{f}(t)} = \frac{\hat{u}(H, t)}{\bar{f}(t)}\, m(H, t), \quad \text{where} \quad \hat{u}(H, t) = \frac{1}{m(H, t)} \sum_{x \in H} f(x)$$

and $\bar{f}(t)$ is the population’s mean fitness at time t.

• Thus, increases or decreases in instances depend on schema average fitness relative to the population average
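This selection-only result can be checked numerically. A sketch with an assumed toy population, computing the expected number of schema instances after fitness-proportionate selection in two ways:

```python
# Assumed toy population of 4-bit strings; fitness is one-max (count of 1s)
pop = ['1100', '1010', '0011', '1111', '0000', '1001']
fit = lambda s: s.count('1')
N = len(pop)
fbar = sum(fit(s) for s in pop) / N         # population mean fitness

in_H = lambda s: s[0] == '1'                # schema H = 1***
members = [s for s in pop if in_H(s)]
m = len(members)                            # m(H, t)
u_hat = sum(fit(s) for s in members) / m    # observed schema fitness

# Under fitness-proportionate selection each string expects N*f(s)/sum(f)
# copies; summing over the members of H, the N cancels, leaving f(s)/fbar:
expected = sum(fit(s) / fbar for s in members)
print(expected, m * u_hat / fbar)           # both sides give 5.0
```

The direct sum and the schema-theorem form $m(H,t)\,\hat{u}(H,t)/\bar{f}(t)$ agree, as the derivation requires.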

Page 5

Schema Theorem with Crossover and Mutation

• The question: what is the probability that schema H survives a crossover or mutation?

• Let d(H) be H’s defining length
• Probability that schema H survives single-point crossover:

$$S_c(H) \geq 1 - p_c \, \frac{d(H)}{l - 1}$$

where $p_c$ is the probability of applying single-point crossover to a string and $l$ is the string length

• The equation shows survival is more likely for shorter schemas
• Probability of surviving mutation:

$$S_m(H) = (1 - p_m)^{o(H)}$$

where $o(H)$ is the number of defined bits in H and $p_m$ is the probability of a single bit mutating
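Both survival probabilities are direct to evaluate. A sketch with an assumed example schema and illustrative operator rates (the values of $p_c$ and $p_m$ are my own, not from the slides):

```python
schema = '1***0*'                # assumed example: l = 6
l = len(schema)
defined = [i for i, c in enumerate(schema) if c != '*']
d = defined[-1] - defined[0]     # defining length d(H) = 4
o = len(defined)                 # order o(H) = 2

p_c, p_m = 0.7, 0.01             # illustrative operator rates (assumed)

# Worst case: any cut point inside the defining length disrupts H
s_c = 1 - p_c * d / (l - 1)
# Every defined bit must independently escape mutation
s_m = (1 - p_m) ** o
print(round(s_c, 3), round(s_m, 4))   # 0.44 0.9801
```

Shrinking d(H) raises the crossover bound toward 1, which is the “shorter schemas survive” observation above in numeric form.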

Page 6

Total Schema Theorem

• The expected number of instances of schema H taking into account selection, crossover, and mutation:

$$E[m(H, t+1)] \geq \frac{\hat{u}(H, t)}{\bar{f}(t)}\, m(H, t) \left[1 - p_c \, \frac{d(H)}{l - 1}\right] (1 - p_m)^{o(H)}$$

• Meaning: Low-order schemas whose average fitness remains above the mean will increase exponentially.
• Reason: Increase of a non-disrupted schema is proportional to $\frac{\hat{u}(H, t)}{\bar{f}(t)}$
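Plugging illustrative values (assumed, not from the slides) into the bound shows how the selection gain and the two disruption factors combine:

```python
# Illustrative values (assumed): schema statistics at time t
m, u_hat, fbar = 4, 2.5, 2.0   # m(H,t), observed schema fitness, mean fitness
l, d, o = 6, 4, 2              # string length, defining length d(H), order o(H)
p_c, p_m = 0.7, 0.01           # crossover and per-bit mutation rates

# Selection gain, scaled down by crossover and mutation survival
bound = (u_hat / fbar) * m * (1 - p_c * d / (l - 1)) * (1 - p_m) ** o
print(round(bound, 4))         # lower bound on E[m(H, t+1)]
```

Here selection alone would grow the schema from 4 to 5 expected instances, but disruption pulls the guaranteed floor down to about 2.16; a shorter, lower-order schema would keep more of the gain.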

Page 7

Building Blocks Hypothesis (Goldberg 1989)

• Crossover combines good schemas into equally good or better higher-order schemas

• That is, crossover (or mutation) is not just destructive: it is a source of the GA’s power

Page 8

Questioning the BBH

• Why would separately discovered building blocks be compatible?

• What about speciation?

• Hybridization is rare in nature

• Gradual elaboration is safer

• The Schema Theorem and BBH assume fixed-length genomes

Page 9

No Free Lunch Theorem (Wolpert and Macready 1996)

• An attack on GAs and “black box” optimization
• Across all possible problems, no optimization method is better than any other
• “Elevated performance over one class of problems is exactly paid for in performance over another class.”
• Implication: Your method is not the best
• Or is it?

Page 10

Hill Climbing vs. Hill Descending

• Isn’t hill climbing better overall? No
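The claim can be verified exhaustively on a tiny search space. A Python sketch (my own construction, not from the lecture) that enumerates every function from a 5-point domain to {0, 1, 2} and compares a blind fixed scan against an adaptive, hill-climbing-flavoured rule, each evaluating 3 distinct points. By the NFL argument, any two non-revisiting strategies have identical average performance over all functions:

```python
from itertools import product

DOMAIN = 5          # search points {0, ..., 4}
VALUES = (0, 1, 2)  # possible fitness values
BUDGET = 3          # distinct evaluations allowed

def fixed_scan(f):
    """Blind search: evaluate points 0, 1, 2 in order."""
    return max(f[i] for i in range(BUDGET))

def adaptive(f):
    """A hill-climbing-flavoured rule that never revisits a point:
    stay local after a best-so-far value, jump far after a poor one."""
    seen, best = [2], f[2]
    while len(seen) < BUDGET:
        unseen = [p for p in range(DOMAIN) if p not in seen]
        last = seen[-1]
        if f[last] >= best:   # last point was the best so far: exploit nearby
            nxt = min(unseen, key=lambda p: abs(p - last))
        else:                 # otherwise explore far away
            nxt = max(unseen, key=lambda p: abs(p - last))
        seen.append(nxt)
        best = max(best, f[nxt])
    return best

funcs = list(product(VALUES, repeat=DOMAIN))   # all 3**5 = 243 functions
totals = [sum(fixed_scan(f) for f in funcs), sum(adaptive(f) for f in funcs)]
print(totals)   # identical total performance: [405, 405]
```

Averaged over every possible function, the “smart” adaptive rule finds exactly the same best value as the blind scan, which is the NFL result in miniature: the answer to “isn’t hill climbing better overall?” really is no.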

Page 11

Very Bad News

• “If an algorithm performs better than random search on some class of problems then it must perform worse than random search on the remaining problems.”

• “One should be wary of trying to generalize [previously obtained] results to other problems.”

• “If the practitioner has knowledge of problem characteristics but does not incorporate them into the optimization algorithm…there are no formal assurances that the algorithm chosen will be at all effective.”

Page 12

Hope is not Lost

• An algorithm can be better over a class of problems if it exploits a common property of that class

• What is the class of problems known as the “real world”?

• Characterizing a class has become important

Page 13

Function Approximation is a Subclass of Optimization

• A function approximator can be estimated
• Estimation means described in fewer dimensions (parameters) than the final solution
• That is not true of optimization in general
• f(x) can be as simple or as complex as we want
• There may be a limit on the number of bits in f(x), i.e. the size of the memory, but we can use them however we want

Optimization: $x_1, x_2, x_3, \ldots, x_n \rightarrow \text{fitness}$

Approximation: $\text{fitness} = f(x)$, where $f$ maps inputs $i_1, i_2, i_3, \ldots, i_n$ to outputs $o_1, o_2, o_3, \ldots, o_n$

Page 14

Free Lunch for Approximation?

• How can the structure of approximation problems be exploited?
– Start with simple approximations
– Complexify them gradually
– Information about a function can be elaborated
– Information is not accumulated in general optimization
• Does this approach equate to a free lunch?
• Neural networks are approximators
• Real-world problems are often approximation problems: Does this escape NFL?

Page 15

Next Week: Neuroevolution (NE)

• Combining EC with neural networks

• Fixed-topology NE and TWEANNs

• The Competing Conventions Problem

• Genetic Algorithms and Neural Networks by Darrell Whitley (1995)
• Evolving Artificial Neural Networks by Xin Yao (1999)
• Genetic set recombination and its application to neural network topology optimisation by N. J. Radcliffe (1993) (skim from section 4 on, except for 9.2)