Ant Colony Optimization and the Minimum Cut Problem

Timo Kötzing, Per Kristian Lehre, Frank Neumann, Pietro S. Oliveto

March 25, 2010
Ant Colony Optimization (ACO)

- We want to analyze the use of Ant Colony Optimization (ACO) for the Minimum Cut Problem.
- As input, the ACO algorithm gets a weighted undirected graph G on n vertices.
- The ACO algorithm iteratively computes partitions of G's vertices into two non-empty sets, one per iteration.
- The algorithm keeps track of the best-so-far candidate solution.
- We analyze the random variable counting the number of iterations required until an optimal solution is found.
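The iterate-and-keep-best scheme above can be sketched as a generic loop. This is a minimal sketch of my own: the construction procedure and the cut-weight objective are passed in as placeholders, since the slide does not fix them.

```python
def aco_min_cut(construct_partition, cut_weight, iterations):
    """Best-so-far loop: build one candidate partition per iteration
    and keep the candidate with the smallest cut weight."""
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        candidate = construct_partition()
        cost = cut_weight(candidate)
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best, best_cost

# Toy check: three fixed candidates with cut weights 5, 2, 7;
# the loop keeps the candidate of cost 2.
candidates = iter([frozenset({1}), frozenset({2}), frozenset({3})])
costs = {frozenset({1}): 5, frozenset({2}): 2, frozenset({3}): 7}
best, best_cost = aco_min_cut(lambda: next(candidates), costs.__getitem__, 3)
```

The random variable analyzed on the slide is the number of loop iterations until `best` is an optimal cut.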
Idea for Constructing Solutions

- Idea for constructing solutions (following Karger and Stein):
- Any forest of n − 2 edges constitutes a partition into two sets (the vertex sets of its two trees).
- Karger and Stein give an algorithm with expected runtime O(n²).
- Our ACO algorithm lets ants choose n − 2 edges sequentially to build candidate solutions, without creating cycles.
- The probability for an edge e to be picked depends on two values associated with that edge:
  - its weight w(e); and
  - the pheromone value τ_e on e.
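The construction step can be illustrated with a small sketch. This is my own illustration: a union–find structure rejects cycle-creating edges, and edge selection here is uniform at random rather than weighted by pheromone and edge weight as in the actual algorithm. On a connected graph, greedily collecting n − 2 acyclic edges always succeeds (a spanning tree has n − 1), and the result has exactly two trees.

```python
import random

def find(parent, v):
    """Root of v's tree in the union-find forest, with path halving."""
    while parent[v] != v:
        parent[v] = parent[parent[v]]
        v = parent[v]
    return v

def random_forest_partition(n, edges, rng=random):
    """Pick n - 2 edges of a connected graph on vertices 0..n-1 without
    creating a cycle; the two resulting trees define the partition."""
    parent = list(range(n))
    pool = list(edges)
    rng.shuffle(pool)
    chosen = 0
    for u, v in pool:
        if chosen == n - 2:
            break
        ru, rv = find(parent, u), find(parent, v)
        if ru != rv:  # edge joins two different trees: no cycle
            parent[ru] = rv
            chosen += 1
    # Group vertices by their tree root: exactly two components remain.
    components = {}
    for v in range(n):
        components.setdefault(find(parent, v), set()).add(v)
    return list(components.values())
```

For example, on the triangle graph (3 vertices, 3 edges) the sketch picks one edge and returns one set of two vertices and one singleton.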
Pheromones

- Pheromones are additional information on the edges.
- A higher pheromone value on an edge e means that e is more likely to be chosen for the next solution.
- Initially, all pheromone values are equal.
- Afterwards, every edge e used in the best-so-far solution has pheromone value h.
- All other edges have pheromone value l.
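In code, this two-level update could look as follows (a sketch under my reading of the slide: edges of the best-so-far solution get the high value h, all other edges the low value l):

```python
def update_pheromones(edges, best_edges, h, l):
    """All-or-nothing pheromone update: value h on edges of the
    best-so-far solution, value l on every other edge."""
    best = set(best_edges)
    return {e: (h if e in best else l) for e in edges}

# Triangle graph; only edge (0, 1) is in the best-so-far solution.
tau = update_pheromones([(0, 1), (1, 2), (0, 2)], [(0, 1)], h=2.0, l=0.5)
```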
Heuristic Information vs. Pheromone Values

- Remember: the probability for an edge e to be picked depends on the two values w(e) and τ_e.
- How do we balance these two values?
- We use two parameters, α and β.
- For an edge e with associated pheromone value τ_e and weight w(e), the ant chooses e with probability proportional to

  τ_e^α · w(e)^β.
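Selection proportional to τ_e^α · w(e)^β is a standard roulette-wheel draw; a sketch of my own, using `random.choices` for the proportional sampling:

```python
import random

def choose_edge(edges, tau, w, alpha, beta, rng=random):
    """Draw one edge with probability proportional to
    tau[e]**alpha * w[e]**beta."""
    weights = [tau[e] ** alpha * w[e] ** beta for e in edges]
    return rng.choices(edges, weights=weights, k=1)[0]

# With all pheromone mass on one edge and alpha = 1, that edge's
# selection weight is the only nonzero one, so it is always drawn.
edges = [("a", "b"), ("b", "c")]
tau = {("a", "b"): 1.0, ("b", "c"): 0.0}
w = {("a", "b"): 3.0, ("b", "c"): 5.0}
picked = choose_edge(edges, tau, w, alpha=1, beta=1)
```

Setting α = 0 ignores pheromones (purely weight-driven choice); β = 0 ignores the weights.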
Results

Recall the selection weight τ_e^α · w(e)^β. We proved the following expected optimization times of ACO.

- If α = 0 and β = 1 (greedy only): O(n²).
- If α = 1 and β = 1 with constant pheromone bounds: times are still polynomially bounded.
- If α = 1 and β = 1 with at least linear pheromone bound ratio: times are not polynomially bounded.
- If β > 1: times are not polynomially bounded.
- If α = 1 and β = 0 with sensible pheromone bounds: times are again not polynomially bounded.
Conclusions

- Don't use an ACO algorithm to solve the Min-Cut Problem.
- ACO can simulate Karger and Stein's algorithm.
- We now understand better how ACO algorithms work.
- We now understand better how to analyze ACO algorithms.
Thank you.