BAYESIAN INFERENCE Sampling techniques Andreas Steingötter

Transcript of BAYESIAN INFERENCE Sampling techniques Andreas Steingötter.

  • Slide 1
  • BAYESIAN INFERENCE Sampling techniques Andreas Steingötter
  • Slide 2
  • Motivation & Background: Exact inference is intractable, so we have to resort to some form of approximation.
  • Slide 3
  • Motivation & Background: Variational Bayes is a deterministic approximation and is not exact even in principle. Alternative approximation: perform inference by numerical sampling, also known as Monte Carlo techniques.
  • Slide 4
  • Motivation & Background
  • Slide 5
  • Classical Monte Carlo approximation (a minimal sketch follows below)
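A minimal sketch of the classical Monte Carlo idea: an expectation E[f(z)] = ∫ f(z) p(z) dz is approximated by averaging f over L samples drawn from p(z). The target and f below are assumptions for illustration, not taken from the slides.

```r
# Classical Monte Carlo: E[f(z)] ~ (1/L) * sum_l f(z^(l)), z^(l) ~ p(z).
# Assumed example: f(z) = z^2 with p(z) = N(0, 1), so the exact value is 1.
set.seed(1)
L <- 10000
z <- rnorm(L)   # draw z^(1), ..., z^(L) from p(z)
mean(z^2)       # Monte Carlo estimate, close to the exact expectation 1
```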
  • Slide 6
  • Motivation & Background
  • Slide 7
  • How to do sampling? 1. Basic sampling algorithms: restricted mainly to 1-/2-dimensional problems. 2. Markov chain Monte Carlo: a very general and powerful framework.
  • Slide 8
  • Basic sampling
  • Slide 9
  • Random sampling: Computers can generate only pseudorandom numbers. Typical defects: correlation of successive values; lack of uniformity of the distribution; poor dimensional distribution of the output sequence; the distances between where certain values occur are distributed differently from those in a true random sequence. A small illustration follows below.
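As a small illustration of these defects, here is a minimal linear congruential generator (LCG). The parameters are the infamous RANDU values, chosen here for illustration because successive triples from RANDU fall onto only a few planes in 3-D, a textbook case of poor dimensional distribution.

```r
# Minimal linear congruential generator: x[i+1] = (a * x[i]) mod m.
# RANDU parameters (a = 65539, m = 2^31); the seed must be odd.
lcg <- function(n, seed = 1, a = 65539, m = 2^31) {
  x <- numeric(n)
  for (i in seq_len(n)) {
    seed <- (a * seed) %% m
    x[i] <- seed / m              # scale to (0, 1)
  }
  x
}
u <- lcg(3000)
# Each value looks uniform in 1-D, but successive triples are structured:
triples <- cbind(u[1:2998], u[2:2999], u[3:3000])
# Plotting these triples (e.g. in a rotating 3-D view) reveals the planes.
```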
  • Slide 10
  • Random sampling from the uniform distribution. Assumption: a good pseudo-random generator for uniformly distributed data is implemented. Alternative: http://www.random.org provides true random numbers, with the randomness coming from atmospheric noise.
  • Slide 11
  • Random sampling from a standard non-uniform distribution
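The usual technique behind this slide title is the transformation (inverse-CDF) method: if U ~ U(0, 1) and F is an invertible CDF, then F⁻¹(U) is distributed with CDF F. A minimal sketch, assuming an Exponential target as the example (the slide images are not preserved in this transcript):

```r
# Inverse-CDF (transformation) method for the Exponential(lambda) distribution:
# F(z) = 1 - exp(-lambda * z), so F^(-1)(u) = -log(1 - u) / lambda.
set.seed(1)
lambda <- 2
u <- runif(10000)              # uniform draws
z <- -log(1 - u) / lambda      # transformed draws, z ~ Exponential(lambda)
c(mean(z), 1 / lambda)         # sample mean vs. theoretical mean 0.5
```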
  • Slide 12
  • Slide 13
  • Rejection sampling
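A minimal sketch of rejection sampling under assumed distributions (a Beta(2, 5) target with a uniform proposal; neither is taken from the slides): draw z from a proposal q, draw a uniform height, and keep z with probability p(z) / (k q(z)), where the constant k makes k q(z) ≥ p(z) everywhere.

```r
# Rejection sampling: target p = Beta(2, 5), proposal q = U(0, 1).
set.seed(1)
k <- max(dbeta(seq(0, 1, by = 1e-4), 2, 5))  # envelope constant, ~2.46
n <- 5000
z <- runif(n)                                # proposals z ~ q
u <- runif(n)                                # uniform heights for the test
accepted <- z[u < dbeta(z, 2, 5) / k]        # accept with prob p(z)/(k q(z))
length(accepted) / n                         # acceptance rate, roughly 1 / k
```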
  • Slide 14
  • Slide 15
  • Slide 16
  • Adaptive rejection sampling
  • Slide 17
  • [Figure: envelope line segments, each defined by a slope and offset at grid point k]
  • Slide 18
  • Adaptive rejection sampling
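A sketch of the idea behind adaptive rejection sampling, under an assumed target (standard normal; not from the slides): for a log-concave density, tangent lines to log p at a set of grid points (the slope/offset lines of the earlier figure) form a piecewise-linear upper bound, so exp(envelope) is a valid rejection envelope. This sketch only builds and checks the envelope; the full algorithm samples from exp(envelope) and inserts every rejected point into the grid to tighten it.

```r
# Envelope construction for adaptive rejection sampling (assumed target).
h  <- function(x) -x^2 / 2    # log density up to a constant (standard normal)
dh <- function(x) -x          # derivative of the log density

grid <- c(-2, 0, 2)           # tangent (abscissa) points
envelope <- function(x) {
  # tangent at g: value h(g), slope dh(g)  ->  h(g) + dh(g) * (x - g)
  tangents <- sapply(grid, function(g) h(g) + dh(g) * (x - g))
  apply(matrix(tangents, nrow = length(x)), 1, min)  # min of the tangents
}
x <- seq(-4, 4, length.out = 401)
all(envelope(x) >= h(x) - 1e-12)  # TRUE: the envelope dominates log p
```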
  • Slide 19
  • Importance sampling
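A minimal sketch of importance sampling with assumed distributions (target p = N(0, 1), proposal q = N(0, 2²), f(z) = z²; none of these come from the slides): sample from q and reweight each draw by p/q.

```r
# Importance sampling: E_p[f(z)] estimated from samples z^(l) ~ q(z)
# using importance weights w_l = p(z^(l)) / q(z^(l)).
set.seed(1)
L <- 10000
z <- rnorm(L, mean = 0, sd = 2)           # draws from the proposal q
w <- dnorm(z, 0, 1) / dnorm(z, 0, 2)      # importance weights p / q
f_hat    <- mean(w * z^2)                 # estimator for normalized p, q
f_hat_sn <- sum(w * z^2) / sum(w)         # self-normalized variant, usable
                                          # when p is known up to a constant
c(f_hat, f_hat_sn)                        # both close to E_p[z^2] = 1
```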
  • Slide 20
  • Slide 21
  • Slide 22
  • Slide 23
  • Markov Chain Monte Carlo (MCMC) sampling
  • Slide 24
  • Slide 25
  • MCMC - Metropolis algorithm
  • Slide 26
  • Slide 27
  • Metropolis algorithm
  • Slide 28
  • Examples: Metropolis algorithm. Implementation in R: elliptical distribution (a minimal sketch follows below).
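A minimal sketch consistent with the setup on the following slides: an elliptical (strongly correlated bivariate Gaussian) target, a starting point in [-2, 2], an isotropic Gaussian proposal with a tunable step size, and chain lengths such as n = 1500 or n = 15000. The specific covariance is an assumption for illustration.

```r
# Metropolis algorithm for an assumed elliptical target.
set.seed(1)
S <- matrix(c(1, 0.95, 0.95, 1), 2, 2)            # strong correlation -> ellipse
log_p <- function(z) -0.5 * sum(z * solve(S, z))  # unnormalized log target

metropolis <- function(n, step = 0.3, init = c(-2, 2)) {
  z <- init
  out <- matrix(NA_real_, n, 2)
  for (i in 1:n) {
    z_star <- z + rnorm(2, sd = step)    # symmetric (Gaussian) proposal
    # accept with probability min(1, p(z*) / p(z)), done on the log scale
    if (log(runif(1)) < log_p(z_star) - log_p(z)) z <- z_star
    out[i, ] <- z                        # a rejected move repeats the state
  }
  out
}
samples <- metropolis(15000, step = 0.5) # compare step = 0.3, 0.5, 1
```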
  • Slide 29
  • Examples: Metropolis algorithm. Implementation in R: initialization in [-2, 2], step size = 0.3, n = 1500 and n = 15000.
  • Slide 30
  • Examples: Metropolis algorithm. Implementation in R: initialization in [-2, 2], step size = 0.5, n = 1500 and n = 15000.
  • Slide 31
  • Examples: Metropolis algorithm. Implementation in R: initialization in [-2, 2], step size = 1, n = 1500 and n = 15000.
  • Slide 32
  • Validation of MCMC: a homogeneous Markov chain z^(1), z^(2), …, z^(m), z^(m+1) whose target distribution is invariant (stationary).
  • Slide 33
  • Validation of MCMC: if the homogeneous chain satisfies detailed balance with respect to the target distribution, that distribution is invariant (stationary). Detailed balance is sufficient (not necessary), and a chain satisfying it is called reversible; the condition is written out below.
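Written out, the detailed balance condition and the invariance it implies (the notation, target p*(z) and transition kernel T, is assumed, as the slide formulas are not preserved):

```latex
% Detailed balance with respect to the target p^{\star}:
p^{\star}(z)\, T(z, z') = p^{\star}(z')\, T(z', z)
% Summing over z shows that p^{\star} is invariant under T:
\sum_{z} p^{\star}(z)\, T(z, z')
  = \sum_{z} p^{\star}(z')\, T(z', z)
  = p^{\star}(z') \sum_{z} T(z', z)
  = p^{\star}(z')
```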
  • Slide 34
  • Validation of MCMC: ergodicity, i.e. the chain converges to the invariant distribution irrespective of the choice of initial distribution.
  • Slide 35
  • Properties and validation of MCMC: the transition probabilities can be constructed as a mixture of base transitions B_k with mixing coefficients α_k.
  • Slide 36
  • Metropolis-Hastings algorithm: a generalization of the Metropolis algorithm to non-symmetric proposal distributions. If the proposal is symmetric, the acceptance criterion reduces to that of the Metropolis algorithm (see the formula below).
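The acceptance probability, written in the notation standard for this material (p̃ the unnormalized target, q_k the proposal; this reconstruction is an assumption, since the slide formula is not preserved in the transcript):

```latex
A_k(z^{\ast}, z^{(\tau)}) = \min\!\left(1,\;
  \frac{\tilde{p}(z^{\ast})\, q_k(z^{(\tau)} \mid z^{\ast})}
       {\tilde{p}(z^{(\tau)})\, q_k(z^{\ast} \mid z^{(\tau)})}\right)
% For a symmetric proposal, q_k(a \mid b) = q_k(b \mid a), the q factors
% cancel and the Metropolis criterion is recovered.
```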
  • Slide 37
  • Metropolis-Hastings algorithm: the proposal is a Gaussian centered on the current state. Small variance -> high acceptance rate, but a slow walk and dependent samples; large variance -> high rejection rate. A quick numerical check of this trade-off follows below.
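A quick numerical check of the trade-off, for an assumed 1-D standard normal target (not from the slides):

```r
# Acceptance rate of a random-walk chain as a function of the proposal sd.
set.seed(1)
acc_rate <- function(step, n = 20000) {
  z <- 0; acc <- 0
  for (i in 1:n) {
    z_star <- z + rnorm(1, sd = step)
    if (log(runif(1)) < dnorm(z_star, log = TRUE) - dnorm(z, log = TRUE)) {
      z <- z_star; acc <- acc + 1
    }
  }
  acc / n
}
sapply(c(0.1, 1, 10), acc_rate)  # small sd: near 1 (slow walk); large sd: low
```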
  • Slide 38
  • Gibbs sampling: variables are updated one at a time, either by repeatedly cycling through them in some order or by randomly choosing the variable to be updated at each step.
  • Slide 39
  • Gibbs sampling
  • Slide 40
  • [Figure: successive Gibbs samples z^(1), z^(2), z^(3)]
  • Slide 41
  • Gibbs sampling. Obtain m independent samples: 1. Run the MCMC sampler through a burn-in period to remove dependence on the initial values. 2. Then take samples at set time points (e.g. every M-th sample). The Gibbs sequence converges to a stationary (equilibrium) distribution that is independent of the starting values; by construction, this stationary distribution is the target distribution we are trying to simulate. A minimal sketch follows below.
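A minimal sketch of a Gibbs sampler with burn-in and thinning as described above, for an assumed bivariate Gaussian target with correlation rho (so that each full conditional is itself Gaussian; the example is not from the slides):

```r
# Gibbs sampling for a bivariate Gaussian with correlation rho:
# z1 | z2 ~ N(rho * z2, 1 - rho^2), and symmetrically for z2 | z1.
set.seed(1)
gibbs <- function(m, rho = 0.8, burn_in = 500, thin = 10) {
  z1 <- 0; z2 <- 0
  out <- matrix(NA_real_, m, 2)
  kept <- 0
  for (i in 1:(burn_in + m * thin)) {
    z1 <- rnorm(1, rho * z2, sqrt(1 - rho^2))  # sample z1 | z2
    z2 <- rnorm(1, rho * z1, sqrt(1 - rho^2))  # sample z2 | z1
    # step 1: discard burn-in; step 2: keep every thin-th draw afterwards
    if (i > burn_in && (i - burn_in) %% thin == 0) {
      kept <- kept + 1
      out[kept, ] <- c(z1, z2)
    }
  }
  out
}
samples <- gibbs(1000)
cor(samples[, 1], samples[, 2])   # close to rho = 0.8
```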
  • Slide 42
  • Gibbs sampling