A Novel Hybridization of Opposition-Based Learning and Cooperative Co-evolutionary for Large-Scale Optimization

Borhan Kazimipour, Mohammad Nabi Omidvar, Xiaodong Li, A. K. Qin

Outline

1. Introduction

2. Background

3. Proposed Framework

4. Experiments

5. Future Work

6. Questions



Research Question


How can the opposition concept and the divide-and-conquer approach be used together to tackle the curse of dimensionality in optimization?

[Diagram: combining Divide and Conquer and Opposite Population to tackle Large-Scale Optimization]

Contributions

In this work:

– We propose a general framework for using opposition-based techniques within cooperative co-evolution

– We implement a very simple, yet effective, instance of the proposed framework

– We empirically show the effectiveness of the proposed framework even in its simplest form

– We provide several advanced statistical analyses to support our findings



Large Scale Optimization


Dealing with Dimensionality

Simplify the problem: Cooperative Co-evolution

1. Divide the high-dimensional problem into smaller sub-problems
2. Solve each sub-problem almost separately
3. Merge the sub-solutions to form a solution to the original problem

Improve the solutions: Opposition-Based Learning

1. Calculate the opposites of the candidate solutions
2. Merge the original and opposite populations
3. Select the best subset of the merged population and evolve it (a minimal sketch of this opposition step follows)
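A minimal sketch of the opposition step listed above, in plain Python. It assumes a minimization objective and box constraints given by `lower`/`upper`; these names are illustrative and not taken from the paper.

```python
def obl_step(population, fitness_fn, lower, upper):
    """Merge a population with its opposite population and keep the best half.

    Each opposite individual is computed element-wise as lower + upper - x,
    where [lower, upper] are the bounds of the search space. Generic sketch,
    not the authors' code.
    """
    opposites = [[lower[i] + upper[i] - x[i] for i in range(len(x))]
                 for x in population]
    merged = population + opposites
    merged.sort(key=fitness_fn)          # minimization: smaller fitness first
    return merged[:len(population)]      # best subset, same population size
```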


Cooperative Co-evolutionary (CC)

• Grouping or Decomposition
– Grouping is the process of dividing a large problem into small sub-problems (a.k.a. groups or subcomponents)
– Interacting (non-separable) variables should be placed in the same group
– Non-interacting (separable) variables are best placed in different groups
– In black-box problems, we may have no information about the interactions between variables

• Decomposition Methods (a random-grouping sketch follows below the note)
– Random Grouping
– Delta Grouping
– Variable Interaction Learning
– Differential Grouping

Note: decomposition methods in the CC framework are entirely independent of the optimizer
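To make the decomposition step concrete, here is a minimal random-grouping sketch; the group count and problem size in the example are illustrative, not the paper's settings.

```python
import random

def random_grouping(dim, num_groups):
    """Randomly partition the variable indices into equally sized groups.

    Assumes dim is divisible by num_groups. This is a generic sketch of the
    random grouping idea and is independent of the optimizer, as noted above.
    """
    indices = list(range(dim))
    random.shuffle(indices)                      # ignore any variable ordering
    size = dim // num_groups
    return [indices[g * size:(g + 1) * size] for g in range(num_groups)]

# Example: decompose a 1000-dimensional problem into 10 subcomponents of 100 variables
groups = random_grouping(1000, 10)
```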



Opposition-Based Learning (OBL)

• Core Theorems
1. A candidate solution and its opposite have equal probability of being closer to the global optimum.
2. The opposite of a candidate solution is more likely to be closer to the global optimum than a second random guess.

• Opposition Techniques
– Opposition-Based Learning (OBL)
– Quasi-Opposition-Based Learning (QOBL)
– Quasi-Reflection Opposition-Based Learning (QROBL)
– Generalized Opposition-Based Learning (GOBL)
– Current Optimum Opposition-Based Learning (COOBL)

Note: all OBL techniques are entirely independent of the optimizer


Opposition-Based Learning (OBL)

• Computing Opposite Points (standard definitions sketched below)

• Graphical Examples
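For reference, a sketch of the standard definitions from the OBL literature, for the i-th dimension of a point $x$ lying in the box $[a, b]$ with interval centre $c$:

$$\breve{x}_i = a_i + b_i - x_i \quad \text{(opposite point, OBL)}$$

$$\breve{x}^{q}_i \sim U\!\left(c_i,\ \breve{x}_i\right), \qquad c_i = \frac{a_i + b_i}{2} \quad \text{(quasi-opposite point, QOBL)}$$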



Proposed Framework

[Framework overview diagrams]

Proposed Framework

• Generality of the proposed framework:

– Any grouping technique (e.g., static, random, delta, differential, or ideal grouping) can be employed

– Any optimizer (e.g., DE, PSO, GA, ABC) can be used

– Any opposition operator (e.g., OBL, QOBL, QROBL, GOBL) can be used
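A minimal, hypothetical sketch of one cycle of the framework. The function names, arguments, and the `jumping_rate` parameter are illustrative placeholders; the instance studied in the paper plugs in random grouping, SaNSDE, and OBL/QOBL as described later.

```python
import random

def hybrid_cc_cycle(population, fitness_fn, grouping, optimizer, opposition, jumping_rate):
    """One co-evolutionary cycle with opposition injection (generic sketch).

    grouping(dim)             -> list of index groups (subcomponents)
    optimizer(pop, group, f)  -> pop evolved only on the given group's variables
    opposition(pop, group)    -> opposite individuals on the given group's variables
    """
    groups = grouping(len(population[0]))
    for group in groups:
        # Evolve this subcomponent (context-vector/cooperation details omitted).
        population = optimizer(population, group, fitness_fn)
        if random.random() < jumping_rate:
            # Occasionally inject an opposite sub-population and keep the best half.
            merged = population + opposition(population, group)
            merged.sort(key=fitness_fn)
            population = merged[:len(population)]
    return population
```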



Experimental Setup

• Benchmark
– CEC 2013 LSGO benchmark suite
– 15 functions
– 1000 dimensions
– Categories:
1. fully separable functions (f1-f3)
2. partially separable functions with a separable subcomponent (f4-f7)
3. partially separable functions with no separable subcomponents (f8-f11)
4. overlapping functions (f12-f14)
5. fully non-separable function (f15)

• Statistical Tests (a minimal sketch follows)
– The Iman-Davenport test (a modified Friedman rank test) is used for ranking
– The Li post-hoc procedure is used as the significance test
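A minimal sketch of the ranking step. Only the underlying Friedman statistic is computed here, via `scipy.stats.friedmanchisquare`; the Iman-Davenport correction and the Li post-hoc procedure require additional computation and are omitted.

```python
from scipy.stats import friedmanchisquare

def friedman_test(results):
    """results: dict mapping algorithm name -> list of scores, one per benchmark
    function (same function order for every algorithm).
    Returns the Friedman chi-square statistic and its p-value."""
    samples = [results[alg] for alg in results]
    return friedmanchisquare(*samples)
```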


Experimental Setup

• Implementations
– Hybridization of OBL and CC (with random grouping), called OBL-CC
– Hybridization of QOBL and CC (with random grouping), called QOBL-CC
– DECC is used as the control method (as suggested in the CEC 2013 LSGO benchmark report)
– Self-adaptive DE with Neighbourhood Search (SaNSDE) is used as the optimizer (as suggested in the CEC 2013 LSGO benchmark report)

• Parameter Values
– Four jumping rate schemes (the probability of applying the OBL/QOBL operator) are used (a sketch follows after this list):
  – Fixed (0.3)
  – Fixed (0.6)
  – Monotonically increasing (0 to 0.6)
  – Monotonically decreasing (0.6 to 0)
– Max FE = 3,000,000
– 51 independent runs
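A minimal sketch of the four jumping-rate schemes. The linear increase/decrease over the evaluation budget is an assumption about how the monotonic schemes are realized, not a detail stated on the slide.

```python
def jumping_rate(scheme, used_fe, max_fe=3_000_000):
    """Probability of applying the OBL/QOBL operator at the current point of the run."""
    progress = used_fe / max_fe
    if scheme == "fixed_0.3":
        return 0.3
    if scheme == "fixed_0.6":
        return 0.6
    if scheme == "increasing":      # 0 -> 0.6 as the run progresses (assumed linear)
        return 0.6 * progress
    if scheme == "decreasing":      # 0.6 -> 0 as the run progresses (assumed linear)
        return 0.6 * (1.0 - progress)
    raise ValueError(f"unknown scheme: {scheme}")
```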


Experimental Results: Simple Statistics


Experimental Results: Win-Draw-Loss Table

• Control Method: DECC

• Statistics:
– Two algorithms are considered significantly different if the p-value of the Wilcoxon rank-sum test is less than 0.05, and statistically similar otherwise


Experimental Results: Win-Draw-Loss Findings

• Except for OBL-CC(0.6) and OBL-CC(0.6~0), hybridization generally improves the performance of DECC.

• QOBL-CC(0.6~0), with 9 wins, shows the best performance.

• OBL-CC(0.3), with only 1 loss, is the most reliable hybridization (i.e., it has the least risk of failure).

• The QOBL-CC family is more effective on problems with some degree of non-separability (G2-G5).

• The QOBL-CC family is less recommended for fully separable problems (G1).


Experimental Results: nWins Table

nWins score = (# wins) - (# losses) in an N x N pairwise comparison (Wilcoxon rank-sum test)
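A minimal sketch of how an nWins score could be computed from pairwise Wilcoxon rank-sum tests (`scipy.stats.ranksums`). The 0.05 threshold matches the earlier win-draw-loss criterion; deciding the winner by comparing medians is an assumption for illustration.

```python
from statistics import median
from scipy.stats import ranksums

def nwins(results, alpha=0.05):
    """results: dict mapping algorithm name -> list of final errors (one per run).
    Returns nWins = (# significant wins) - (# significant losses) per algorithm."""
    score = {a: 0 for a in results}
    for a in results:
        for b in results:
            if a == b:
                continue
            _, p = ranksums(results[a], results[b])
            if p < alpha:                                   # significant difference
                if median(results[a]) < median(results[b]):  # lower error wins
                    score[a] += 1
                else:
                    score[a] -= 1
    return score
```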


Experimental Results: nWins Findings

• All hybrid methods obtain better nWins scores than DECC (a success for hybridization).

• DECC is never the single best method.

• The last row of Table III confirms that DECC is the weakest of the compared algorithms.

• Overall, the QOBL-CC family is the best.

• QOBL-CC methods show weak performance on fully separable functions (G1).

• For G2-G5, the best performer is always from the QOBL-CC family.


Experimental Results: Friedman Ranking

• Friedman Test
– One of the state-of-the-art procedures for 1 x N and N x N comparisons
– Controls the family-wise error rate (FWER)
– Supports stronger conclusions

• Compatibility
– The Friedman ranking confirms all previous findings (win-draw-loss and nWins).


Experimental Results: Friedman Ranking Findings

• On f1, DECC shares the first position with OBL-CC.

• On f5 and f10, all algorithms perform statistically similarly.

• In all cases except f1 and f5, hybridization provides a significant improvement.

• QOBL-CC is not as effective as the other hybrid methods on fully separable problems (G1).

• On partially separable and non-separable functions (G2-G5), the QOBL-CC variants are always the best performers.

• DECC is, in general, the worst algorithm.

• QOBL-CC(0~0.6) is the top performer.


Experimental Results: Li Post-hoc Procedure

• Li Post-hoc Procedure
– Used to compare all methods against the control method (DECC).
– Helps to determine whether one method is statistically significantly better than another.
– Yes: the best method(s) are significantly better than DECC.
– No: the best method(s) are NOT significantly better than DECC (statistically similar).



Future Work

• Extension
– Comparing different opposition strategies, decomposition techniques, and core optimizers.

• Sensitivity Analysis
– Studying influential parameters (e.g., the jumping rate Jr).

• Comparison
– Comparing the performance of the framework with state-of-the-art methods in LSO.



Thank you ☺

Any questions or comments?
