
Data Mining Driven Neighborhood Search

Michele Samorani, Manuel Laguna


• PROBLEM: In meta-heuristic methods that are based on neighborhood search, whenever a local optimum is encountered, the search has to escape from it
– Tabu Search uses tabu lists and other strategies
– Other variations are: Path Relinking, Randomization, Restarting

• FACT: Escape directions are set by a priori rules

Problem


Goal of this work

Class of problems

1. Take a few instances
2. Consider the local optima
3. Learn
4. Given another instance
5. Use the knowledge to tackle it by using "smart" constraints

General framework for any class of problems


• How to learn the constraints
• How to apply them
• Results
• Conclusions

Outline


HOW TO LEARN THE CONSTRAINTS


• Collect many local optima from instances of the same class of problems

• For each local optimum Ai, consider the local optima nearby, Bk (k in N(i)), forming pairs (Ai, Bk)

• Denote each pair with '-' if the objective function improves from Ai to Bk, '+' otherwise

How to learn the constraints
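The pair-labeling step above can be sketched in a few lines (the names and the solution representation are my own, not from the slides; a minimization objective is assumed):

```python
def label_pairs(local_optima, neighbors_of, objective):
    """For each local optimum A_i, pair it with the nearby local
    optima B_k and label the pair '-' if the objective improves
    (decreases) from A_i to B_k, '+' otherwise."""
    pairs = []
    for a in local_optima:
        for b in neighbors_of(a):
            label = '-' if objective(b) < objective(a) else '+'
            pairs.append((a, b, label))
    return pairs
```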


[Figure: four initial local optima A1–A4, each linked to nearby local optima among B1–B12; each pair (Ai, Bk) is labeled '-' or '+']

How to learn the constraints


• Constrained Task Allocation Problem (CTAP):
– Assign m tasks to n CPUs minimizing the total cost
• Costs:
– Fixed cost: if CPU j is used, we pay Sj
– Communication cost: if tasks p and q are in different CPUs, we pay c(p,q)

Example: CTAP
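A minimal sketch of the CTAP cost function just defined (the data representation is an assumption of mine, not from the slides):

```python
def ctap_cost(assign, S, c):
    """Total CTAP cost: fixed cost S[j] for every CPU j that is used,
    plus communication cost c[{p, q}] for every pair of tasks p, q
    assigned to different CPUs.
    assign: task -> CPU; S: CPU -> fixed cost;
    c: frozenset({p, q}) -> communication cost."""
    fixed = sum(S[j] for j in set(assign.values()))
    comm = sum(cost for pair, cost in c.items()
               if len({assign[p] for p in pair}) == 2)
    return fixed + comm
```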


• Suppose S2 > S3. Consider this move:

• This move is unlikely to be performed because:
1. We would introduce the fixed cost S3
2. We would introduce the communication cost c5,6

[Diagram: before the move, CPU1 = {T1, T2, T3, T4}, CPU2 = {T5, T6}, CPU3 = {}; after moving T6, CPU1 = {T1, T2, T3, T4}, CPU2 = {T5}, CPU3 = {T6}]

Example: CTAP


• Suppose S2 > S3. Consider this move:

• But at the next move, we could move T5 too

• We want to learn rules like: "if there is an empty CPU y that can accommodate the tasks assigned to CPU x, and it has a smaller fixed cost, move the tasks from x to y"

[Diagram: before the move, CPU1 = {T1, T2, T3, T4}, CPU2 = {T5, T6}, CPU3 = {}; after moving T6, CPU1 = {T1, T2, T3, T4}, CPU2 = {T5}, CPU3 = {T6}]

In this rule, "there is an empty CPU y that can accommodate the tasks assigned to CPU x, and it has a smaller fixed cost" is a condition on the local optimum, while "move the tasks from x to y" is a condition on the pair of local optima.

Example: CTAP
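The example rule can be sketched as a condition check on a local optimum (illustrative code; a single capacity number per CPU is my simplification, since the slides do not specify the capacity model):

```python
def empty_cheaper_cpu_move(assign, S, capacity, load):
    """Look for a used CPU x and an empty CPU y with a smaller fixed
    cost that can accommodate all of x's tasks; return the suggested
    move (x, y), or None if the rule's condition does not hold."""
    used = set(assign.values())
    for x in used:
        x_load = sum(load[t] for t, cpu in assign.items() if cpu == x)
        for y in S:
            if y not in used and S[y] < S[x] and x_load <= capacity[y]:
                return (x, y)
    return None
```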


• A Rule Rt is a pair of conditions (Ft, Gt)
• Rt has to be applied at a local optimum L and has the following form: "If L satisfies condition Ft, then go towards a solution S such that (L, S) satisfies condition Gt"

How to learn the constraints
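A rule can be represented directly as this pair of predicates (a sketch; the concrete encoding of Ft and Gt is left open here, as it depends on the learner):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """R_t = (F_t, G_t): if a local optimum L satisfies F_t, steer
    the search toward a solution S such that (L, S) satisfies G_t."""
    F: Callable  # condition on a local optimum L
    G: Callable  # condition on a pair (L, S)
```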


[Figure: the same A1–A4 / B1–B12 pair diagram, with the selected subset of initial local optima marked Ft = 1 and their '-' pairs marked Gt = 1]

Finding a rule Rt = finding a subset of initial local optima such that:
1. They can be distinguished from the other local optima through condition Ft
2. If Ft is satisfied, then there are '-' pairs satisfying condition Gt


How to learn the constraints
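The slides require F and G to be the outputs of a binary classifier. As the simplest possible stand-in (not the authors' method), here is a one-feature threshold classifier learned from labeled local optima:

```python
def learn_stump(X, y):
    """Fit a decision stump: pick the feature f and threshold t that
    best separate the selected local optima (y = 1) from the others
    (y = 0), and return the learned condition as a predicate."""
    best = None  # (errors, feature, threshold)
    for f in range(len(X[0])):
        for x in X:
            t = x[f]
            errors = sum((xi[f] >= t) != bool(yi)
                         for xi, yi in zip(X, y))
            if best is None or errors < best[0]:
                best = (errors, f, t)
    _, f, t = best
    return lambda point: point[f] >= t
```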



Mathematical Model for 1 rule

Constraints on F and G: F (or G) = 1 and F (or G) = 0 must be the outputs of a binary classifier


USE THE CONSTRAINTS TO ENHANCE THE SEARCH


F G

F1 G1

F2 G2

… …

If L satisfies condition Ft, then go towards a solution S such that (L, S) satisfies condition Gt

Output of learning


When a local optimum L satisfies Ft:

• ESCAPE Tabu Search: set the objective function to "max Gt(L, S)" and search while Gt < 0. If Gt cannot be satisfied within maxSteps, the escape is unsuccessful.

• EXPLORATION Tabu Search: once Gt(L, S) > 0, restore the real objective function and add the constraint Gt(L, S) > 0; search while Value(S) ≥ Value(L) and Step < maxSteps. If Value(S) < Value(L), the escape is successful; if maxSteps is reached first, it is unsuccessful.

Enforcing the constraints
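The two-phase escape above can be sketched with a plain greedy search standing in for the two tabu searches (hypothetical helpers; `neighbors`, `value` and `g` are supplied by the caller, and tabu memory is omitted for brevity):

```python
def smart_escape(L, neighbors, value, g, max_steps):
    """Two-phase escape from local optimum L under rule condition G_t
    (here the function g, with g(L, S) > 0 meaning 'satisfied').
    Phase 1 (ESCAPE): climb on g until g(L, S) > 0.
    Phase 2 (EXPLORATION): minimize the real objective subject to
    g(L, S) > 0 until a solution better than L is found.
    Returns the improved solution, or None on an unsuccessful escape."""
    S, steps = L, 0
    while g(L, S) <= 0:                  # ESCAPE: objective = max g
        if steps >= max_steps:
            return None                  # can't satisfy G_t in time
        S = max(neighbors(S), key=lambda n: g(L, n))
        steps += 1
    while value(S) >= value(L):          # EXPLORATION: real objective
        feasible = [n for n in neighbors(S) if g(L, n) > 0]
        if not feasible or steps >= max_steps:
            return None                  # unsuccessful escape
        S = min(feasible, key=value)
        steps += 1
    return S                             # successful escape
```

On integers with `value(x) = (x - 5)**2` and the constraint "S > 2", starting from L = 0 the escape phase climbs to S = 3, which already improves on L.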


EXPERIMENTS AND RESULTS


• 108 instances of CTAP – A. Lusa and C. N. Potts (2008)

Problem Set


“Which between tabu search and smart escape is more effective to escape from a local optimum?”

• For 30 times:
– Find 1 rule (F1, G1) using 30% of the local optima
– For each local optimum L of the remaining local optima (70%):
• Try to escape from L using:
– smart constraints
– a simple tabu search
• And see which "valley" we end up in through a local search

Experiment 1 – better LO?



Accuracy = 81.58%

Experiment 1 – Results


Experiment 1 – Results


Experiment 1 – Results

• Which one yields the greatest improvement from the initial local optimum to the final local optimum?


Experiment 2 – better search?

• Compare the following:
– GRASP + tabu search (max non-improving moves = m)
– GRASP + smart search (with 1 or with 2 rules)

• Whenever a local optimum is found:
– If a suitable rule is available, apply the corresponding constraint with maxSteps = m
– Otherwise, run a tabu search (max moves = m)

• Run for 50 times on 72 instances, and record the best solution found


Experiment 2 – Results

Comparison to Tabu Search

New Best Known Solutions


Additional experiments on the Matrix Bandwidth Minimization Problem

• This problem is equivalent to labeling the vertices of an undirected graph so that the maximum difference between the labels of any pair of adjacent vertices is minimized

• We considered the data set of Martí et al. (2001), which is composed of 126 instances

• A simple tabu search performs well on 115 instances, and poorly on 11

• We used 30 of the easy instances as training set and the 11 hard ones as test set

• For 50 times:
– For each test instance:
• Generate a random solution
• Run a regular Tabu Search
• Run a Smart Tabu Search (Data Mining Driven Tabu Search – DMDTS)
• Record the number of wins, ties, losses
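The bandwidth objective described above is straightforward to compute for a given labeling (a sketch; the data shapes are my assumption):

```python
def bandwidth(labels, edges):
    """Bandwidth of a vertex labeling: the maximum difference between
    the labels of any pair of adjacent vertices, i.e. the quantity the
    Matrix Bandwidth Minimization Problem minimizes.
    labels: vertex -> integer label; edges: iterable of (u, v) pairs."""
    return max(abs(labels[u] - labels[v]) for u, v in edges)
```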


Additional experiments on the Matrix Bandwidth Minimization Problem

Wins - losses

7 wins, 1 loss, 3 ties


CONCLUSIONS


Conclusions

• We showed that:
– It is possible to learn offline from other instances of the same class of problems
– It is possible to effectively exploit this knowledge by dynamically introducing guiding constraints during a tabu search


Research Opportunities

• Improve the learning part (heuristic algorithm)
• Improve constraint enforcement
• Apply this idea to other neighborhood searches
• Explore the potential of this idea on other problems


Thank you for your attention