Combinatorial and algebraic tools for multigrid


Yiannis Koutis
Computer Science Department, Carnegie Mellon University
Aladdin Lamps '05, Carnegie Mellon School of Computer Science, 05/11/2005


multilevel methods

www.mgnet.org
• 3500 citations
• 25 free software packages
• 10 special conferences since 1983

• Algorithms do not always work
• Limited theoretical understanding


multilevel methods: our goals

• provide theoretical understanding
• solve multilevel design problems
• small changes in current software

• study structure of eigenspaces of Laplacians

• extensions for multilevel eigensolvers


Overview

• Quick definitions
• Subgraph preconditioners
• Support graph preconditioners
• Algebraic expressions
• Low frequency eigenvectors and good partitionings
• Multigrid introduction and current state
• Multigrid – our contributions
• Other results


quick definitions

• Given a graph G with weights wij

• Laplacian: A(i,j) = -wij, row sums = 0

• Normalized Laplacian: D^{-1/2} A D^{-1/2}, where D is the diagonal of A

• The condition number κ(A,B) is a measure of how well B approximates A (and vice versa)
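A minimal sketch of these definitions in Python/NumPy; the 4-node weighted path used here is made up purely for illustration.

```python
import numpy as np

def laplacian(n, weighted_edges):
    """Graph Laplacian: A[i, j] = -w_ij off the diagonal, rows sum to zero."""
    A = np.zeros((n, n))
    for i, j, w in weighted_edges:
        A[i, j] -= w
        A[j, i] -= w
        A[i, i] += w
        A[j, j] += w
    return A

# Toy example: a weighted path on 4 nodes (illustration only)
edges = [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 1.0)]
A = laplacian(4, edges)

# Normalized Laplacian: D^{-1/2} A D^{-1/2}, with D the diagonal of A
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(A)))
A_norm = D_inv_sqrt @ A @ D_inv_sqrt

print(A.sum(axis=1))                 # rows sum to ~0
print(np.linalg.eigvalsh(A_norm))    # eigenvalues of the normalized Laplacian
```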


linear systems: preconditioning

• Goal: Solve Ax = b via an iterative method

• A is a Laplacian of size n with m edges. Complexity depends on κ(A,I) and m

• Solution: Solve B^{-1}Ax = B^{-1}b
• Bz = y must be easily solvable
• κ(A,B) is small
• B is the preconditioner
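A hedged sketch of a preconditioned solve using SciPy's conjugate gradient. The diagonal (Jacobi) preconditioner below is only a stand-in for B; in the talk, B would be a subgraph or support-graph preconditioner, but the interface is the same.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import cg, LinearOperator

# Toy Laplacian A: a weighted path graph on n nodes (illustration only)
n = 100
main = 2.0 * np.ones(n); main[0] = main[-1] = 1.0
A = csr_matrix(np.diag(main) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1))

b = np.random.randn(n); b -= b.mean()    # make b orthogonal to the null space of A

# Stand-in preconditioner B: the diagonal of A (Jacobi).
d = A.diagonal()
M = LinearOperator((n, n), matvec=lambda r: r / d)   # applies B^{-1} to a residual

x, info = cg(A, b, M=M)
print(info, np.linalg.norm(A @ x - b))
```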



combinatorial preconditioners: the Vaidya thread

• B is a sparse subgraph of A, possibly with additional edges

Solving Bz = y is performed as follows:
1. Gaussian elimination on degree ≤ 2 nodes of B
2. A new, smaller system must be solved
3. Recursively call the same algorithm on the reduced system to get an approximate solution
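The elimination of degree ≤ 2 nodes can be written as repeated Schur complements on the Laplacian. The sketch below is my own illustration of that reduction step on a made-up graph; it is not the actual recursive solver.

```python
import numpy as np

def eliminate_vertex(A, v):
    """Schur complement of a Laplacian A with respect to vertex v
    (one step of Gaussian elimination); the result is again a Laplacian."""
    keep = [i for i in range(A.shape[0]) if i != v]
    return A[np.ix_(keep, keep)] - np.outer(A[keep, v], A[v, keep]) / A[v, v]

def reduce_low_degree(A):
    """Repeatedly eliminate vertices of degree <= 2 (a sketch of step 1 above)."""
    while True:
        deg = (np.abs(A) > 1e-12).sum(axis=1) - 1   # off-diagonal nonzeros per row
        cands = np.where((deg >= 1) & (deg <= 2))[0]
        if len(cands) == 0 or A.shape[0] <= 2:
            return A
        A = eliminate_vertex(A, cands[0])

# Toy graph: a weighted cycle, so every node has degree 2 and the matrix shrinks a lot
edges = [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 1.0), (3, 4, 1.0), (0, 4, 0.5)]
n = 5
A = np.zeros((n, n))
for i, j, w in edges:
    A[i, j] -= w; A[j, i] -= w; A[i, i] += w; A[j, j] += w
print(reduce_low_degree(A))
```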



combinatorial preconditioners: the Vaidya thread

• Graph sparsification [Spielman, Teng]
• Low stretch trees [Elkin, Emek, Spielman, Teng]
• Near optimal O(m poly(log n)) complexity

• Focus on constructing a good B
• κ(A,B) is well understood – B is sparser than A
• B can look complicated even for simple graphs A



combinatorial preconditioners: the Gremban–Miller thread

• the support graph S is bigger than A

[Figure: a small weighted graph A, the larger support graph S built on it, and the quotient of S]


combinatorial preconditioners: the Gremban–Miller thread

• The preconditioner S is often a natural graph
• S inherits the sparsity properties of A
• S is equivalent to a dense graph B of size equal to that of A: κ(A,S) = κ(A,B)
• Analysis of κ(A,S) made easy by work of [Maggs, Miller, Ravi, Woo, Parekh]
• Existence of good S by work of [Räcke]



algebraic expressions

• Suppose we are given m clusters in A
• R(i,j) = 1 if the jth cluster contains node i
• R is n x m
• R is the clustering matrix
• Quotient: Q = R^T A R
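A minimal sketch of the clustering matrix R and the quotient Q = R^T A R, on a made-up 6-node graph with two clusters joined by one weak edge.

```python
import numpy as np

def clustering_matrix(n, clusters):
    """R is n x m with R[i, j] = 1 if node i belongs to cluster j."""
    R = np.zeros((n, len(clusters)))
    for j, cluster in enumerate(clusters):
        for i in cluster:
            R[i, j] = 1.0
    return R

# Toy Laplacian of a 6-node graph with two obvious clusters {0,1,2} and {3,4,5}
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0),
         (3, 4, 1.0), (4, 5, 1.0), (3, 5, 1.0),
         (2, 3, 0.1)]                      # weak edge between the clusters
n = 6
A = np.zeros((n, n))
for i, j, w in edges:
    A[i, j] -= w; A[j, i] -= w; A[i, i] += w; A[j, j] += w

R = clustering_matrix(n, [[0, 1, 2], [3, 4, 5]])
Q = R.T @ A @ R                            # quotient Laplacian (2 x 2 here)
print(Q)                                   # [[ 0.1, -0.1], [-0.1, 0.1]]
```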


algebraic expressions

• The inverse preconditioner
• The normalized version
• R^T D^{1/2} is the weighted clustering matrix
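A small hedged check, reusing the toy graph above, of how the weighted clustering matrix R^T D^{1/2} interacts with the normalized Laplacian: sandwiching D^{-1/2} A D^{-1/2} between R^T D^{1/2} and its transpose recovers the same quotient R^T A R. The identity is my own illustration, not a formula taken from the slides.

```python
import numpy as np

# Same toy Laplacian A and clustering matrix R as in the previous sketch
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0),
         (3, 4, 1.0), (4, 5, 1.0), (3, 5, 1.0), (2, 3, 0.1)]
n = 6
A = np.zeros((n, n))
for i, j, w in edges:
    A[i, j] -= w; A[j, i] -= w; A[i, i] += w; A[j, j] += w
R = np.zeros((n, 2)); R[[0, 1, 2], 0] = 1.0; R[[3, 4, 5], 1] = 1.0

D_sqrt = np.diag(np.sqrt(np.diag(A)))
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(A)))
A_norm = D_inv_sqrt @ A @ D_inv_sqrt             # normalized Laplacian

W = R.T @ D_sqrt                                 # weighted clustering matrix R^T D^{1/2}
print(np.allclose(W @ A_norm @ W.T, R.T @ A @ R))   # True: the same quotient Q
```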




good partitions and low frequency invariant subspaces

• Suppose the graph A has a good clustering defined by the clustering matrix R
• Let y be any vector satisfying the condition given (as a formula) on the slide

Theorem: [inequality given on the slide]. The inequality is tight up to a constant for certain graphs.

(quality test?)


good partitions and low frequency invariant subspaces

• Let y be any vector satisfying the same condition as before
• Let x be mostly a linear combination of eigenvectors corresponding to eigenvalues close to zero

Theorem: [stated on the slide]
• Prove?
• We can pick a random vector x and check its distance to the closest y (see the sketch below)
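A hedged sketch of that check on the toy clustered graph used earlier: draw x from the span of the lowest eigenvectors of A and measure its distance to the range of the clustering matrix R (the admissible y's). The graph, the number of eigenvectors, and the projection are illustrative choices, not the thesis's exact test.

```python
import numpy as np

# Toy clustered graph and clustering matrix R (same as the earlier sketches)
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0),
         (3, 4, 1.0), (4, 5, 1.0), (3, 5, 1.0), (2, 3, 0.1)]
n = 6
A = np.zeros((n, n))
for i, j, w in edges:
    A[i, j] -= w; A[j, i] -= w; A[i, i] += w; A[j, j] += w
R = np.zeros((n, 2)); R[[0, 1, 2], 0] = 1.0; R[[3, 4, 5], 1] = 1.0

# x: a random combination of the two lowest-frequency eigenvectors of A
vals, vecs = np.linalg.eigh(A)
x = vecs[:, :2] @ np.random.randn(2)
x /= np.linalg.norm(x)

# Distance from x to the closest y in range(R) (orthogonal projection onto range(R))
P = R @ np.linalg.pinv(R)                 # projector onto the span of the cluster indicators
print(np.linalg.norm(x - P @ x))          # small when the clustering captures the low frequencies
```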



multigrid – short introduction

• General class of algorithms

• Richardson iteration: x ← x + (b − Ax)

• High frequency components of the error are reduced by the iteration
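A minimal sketch (my own toy setup) of why Richardson iteration acts as a smoother: on a 1-D path Laplacian, a few scaled Richardson steps essentially remove the high-frequency part of the error while leaving the low-frequency part almost untouched.

```python
import numpy as np

# Toy problem: 1-D path-graph Laplacian; we solve A x = 0, so the iterate IS the error
n = 64
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A[0, 0] = A[-1, -1] = 1.0

vals, vecs = np.linalg.eigh(A)
low, high = vecs[:, 1], vecs[:, -1]       # a low- and a high-frequency eigenvector
x = low + high                            # initial error contains both

# Scaled Richardson smoothing: x <- x + w*(b - A x), with w = 1/lambda_max
b = np.zeros(n)
w = 1.0 / vals[-1]
for _ in range(5):
    x = x + w * (b - A @ x)

# High-frequency component is essentially gone; low-frequency one barely moves
print(abs(low @ x), abs(high @ x))
```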


initial and smoothed error

[Figure: the initial, oscillatory error and the error after smoothing]


the basic multigrid algorithm

• Define a smaller graph Q
• Define a projection operator Rproject
• Define a lift operator Rlift

1. Apply t rounds of smoothing (how many? which iteration?)
2. Take the residual r = b − A x_old
3. Solve Qz = Rproject r (by recursion)
4. Form the new iterate x_new = x_old + Rlift z
5. Apply t rounds of smoothing (is this needed?)

A sketch of one such cycle is given below.
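A hedged sketch of the cycle in NumPy, with a pairwise-aggregation clustering playing the role of Rlift (so Rproject is its transpose) and a direct coarse solve standing in for the recursion; all concrete choices (graph, aggregates, damping, number of smoothings) are illustrative.

```python
import numpy as np

def smooth(A, x, b, w, t):
    """t rounds of scaled Richardson smoothing: x <- x + w*(b - A x)."""
    for _ in range(t):
        x = x + w * (b - A @ x)
    return x

def cycle(A, b, x, R, w, t=2):
    """One two-level cycle; the coarse solve replaces the recursion in this sketch."""
    x = smooth(A, x, b, w, t)                        # 1. pre-smoothing
    r = b - A @ x                                    # 2. residual
    Q = R.T @ A @ R                                  #    coarse (quotient) operator
    z = np.linalg.lstsq(Q, R.T @ r, rcond=None)[0]   # 3. "solve" Qz = Rproject r
    x = x + R @ z                                    # 4. lift the correction
    return smooth(A, x, b, w, t)                     # 5. post-smoothing

# Toy problem: path Laplacian, aggregation of consecutive pairs of nodes
n = 64
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
A[0, 0] = A[-1, -1] = 1.0
R = np.zeros((n, n // 2))
for j in range(n // 2):
    R[2 * j, j] = R[2 * j + 1, j] = 1.0              # Rlift = clustering matrix

b = np.random.randn(n); b -= b.mean()
x = np.zeros(n)
w = 1.0 / np.linalg.eigvalsh(A)[-1]
for _ in range(20):
    x = cycle(A, b, x, R, w)
print(np.linalg.norm(A @ x - b))                     # residual after 20 cycles
```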


algebraic multigrid (AMG)

Goals: The range of Rproject must approximate the unreduced error very well. The error not reduced by smoothing must be reduced by the smaller grid.



algebraic multigrid (AMG)

• Smoothing by Jacobi iteration, or 'scaled' Richardson
• Find a clustering [heuristic]
• Rproject = (Rlift)^T [heuristic]
• Q = Rproject^T A Rproject
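For reference, the Jacobi smoother mentioned above, written here as weighted Jacobi with a conventional damping ω = 2/3 (my choice, not fixed by the slides), as a drop-in alternative to the Richardson smoother in the cycle sketch.

```python
import numpy as np

def jacobi_smooth(A, x, b, t, omega=2.0 / 3.0):
    """t rounds of weighted Jacobi: x <- x + omega * D^{-1} (b - A x),
    where D is the diagonal of A."""
    d = np.diag(A)
    for _ in range(t):
        x = x + omega * (b - A @ x) / d
    return x
```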


two level analysis

• Analyze the maximum eigenvalue of the two-level error propagation operator
• The matrix T1 eliminates the error in a particular subspace
• A low frequency eigenvector has a significant component in that subspace


two level analysis

• Starting hypothesis: Let X be the subspace corresponding to eigenvalues smaller than a threshold. Let Y be the null space of Rproject. Assume that ⟨X,Y⟩^2 is bounded by a small ratio determined by the threshold.

• Two level convergence: the error is reduced by a fixed factor

• Proving the hypothesis? Only in limited cases


current state

‘there is no systematic AMG approach that has proven effective in any kind of general context’

[BCFHJMMR, SIAM Journal on Scientific Computing, 2003]


our contributions – two level

• There exists a good clustering given by R. The quality is measured by the condition number (A,S)

• Q = RT A R• Richardson’s with

• Projection matrix


our contributions – two level analysis

• Starting hypothesis: Let X be the subspace corresponding to eigenvalues smaller than a threshold. Let Y be the null space of Rproject = R^T D^{1/2}. Assume that ⟨X,Y⟩^2 is bounded by a small ratio determined by the threshold.

• Two level convergence: the error is reduced by a fixed factor
• Proving the hypothesis? Yes! Using κ(A,S)
• Result holds for t = 1 smoothing
• Additional smoothings do not help
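A hedged numerical illustration of the starting hypothesis on the toy clustered graph from the earlier sketches: X is spanned by the low eigenvectors of the normalized Laplacian (threshold hand-picked), Y is the null space of Rproject = R^T D^{1/2}, and the squared cosine of the smallest principal angle between them comes out small. The exact threshold and bound used in the analysis are not reproduced here.

```python
import numpy as np
from scipy.linalg import null_space, subspace_angles

# Toy clustered graph and clustering matrix R (same as the earlier sketches)
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0),
         (3, 4, 1.0), (4, 5, 1.0), (3, 5, 1.0), (2, 3, 0.1)]
n = 6
A = np.zeros((n, n))
for i, j, w in edges:
    A[i, j] -= w; A[j, i] -= w; A[i, i] += w; A[j, j] += w
R = np.zeros((n, 2)); R[[0, 1, 2], 0] = 1.0; R[[3, 4, 5], 1] = 1.0

D_sqrt = np.sqrt(np.diag(A))
A_norm = A / np.outer(D_sqrt, D_sqrt)            # normalized Laplacian D^{-1/2} A D^{-1/2}
Rproject = R.T * D_sqrt                          # R^T D^{1/2}

# X: eigenvectors of the normalized Laplacian below a hand-picked threshold
vals, vecs = np.linalg.eigh(A_norm)
X = vecs[:, vals < 0.5]
# Y: null space of Rproject
Y = null_space(Rproject)

# cos^2 of the smallest principal angle between X and Y: small for a good clustering
print(np.cos(subspace_angles(X, Y).min()) ** 2)
```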


our contributions – recursion

• There is a matrix M which characterizes the error reduction after one full multigrid cycle

• We need to upper bound its maximum eigenvalue as a function of the two-level eigenvalues

• The maximum eigenvalue of M is upper bounded by the sum of the maximum eigenvalues over all two-level problems


towards full convergence

• Goal: The error not reduced by smoothing must be reduced by the smaller grid

A different point of view: the small grid does not reduce part of the error; rather, it changes its spectral profile.


full convergence for regular d-dimensional toroidal meshes

• A simple change in the implementation of the algorithm, involving an operator T2

• T2 has eigenvalues 1 and −1

• T2 x_low = x_high: low-frequency vectors are mapped to high-frequency ones


full convergence for regular d-dimensional toroidal meshes

• With t = O(log log n) smoothings

• Using the recursive analysis: λ_max(M) ≤ 1/2

• Both pre-smoothings and post-smoothings are needed

• Holds for perturbations of toroidal meshes


Thanks!