16 Light

Transcript of 16 Light

  • Slide 1/58

    Capturing Light

    Some slides from M. Agrawala, F. Durand, P. Debevec,

    A. Efros, R. Fergus, D. Forsyth, M. Levoy, and S. Seitz

  • Slide 2/58

    The Plenoptic Function

    Q: What is the set of all things that we can ever see?

    A: The Plenoptic Function (Adelson & Bergen)

    Let's start with a stationary person and try to parameterize everything that she can see.

    Figure by Leonard McMillan

  • Slide 3/58

    Grayscale Snapshot

    is intensity of light

    Seen from a single viewpoint

    At a single time

    Averaged over the wavelengths of the visible spectrum

    P(θ, φ)

  • Slide 4/58

    Color Snapshot

    is intensity of light

    Seen from a single viewpoint

    At a single time

    As a function of wavelength

    P(θ, φ, λ)

  • Slide 5/58

    A Movie

    is intensity of light

    Seen from a single viewpoint

    Over time

    As a function of wavelength

    P(θ, φ, λ, t)

  • Slide 6/58

    Holographic Movie

    is intensity of light

    Seen from ANY viewpoint

    Over time

    As a function of wavelength

    P(θ, φ, λ, t, VX, VY, VZ)

  • Slide 7/58

    The Plenoptic Function

    Can reconstruct every possible view, at every moment, from every position, at every wavelength

    Contains every photograph, every movie, everything that anyone has ever seen

    P(θ, φ, λ, t, VX, VY, VZ)
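    The progression above is one function with arguments held fixed or averaged out. A minimal Python sketch of that idea (the toy scene and all names here are illustrative, not from the slides):

      import math

      def plenoptic(theta, phi, wavelength, t, vx, vy, vz):
          """Toy stand-in for P(θ, φ, λ, t, VX, VY, VZ): intensity of light seen from
          viewpoint (vx, vy, vz), in direction (theta, phi), at the given wavelength and time."""
          # Arbitrary smooth "scene" so the sketch is runnable.
          return 0.5 + 0.5 * math.sin(3 * theta + 2 * phi + t + vx + vy + vz) * \
                 math.exp(-((wavelength - 550e-9) / 100e-9) ** 2)

      def grayscale_snapshot(theta, phi, t0=0.0, viewpoint=(0.0, 0.0, 0.0), n=32):
          """The grayscale-snapshot case: one viewpoint, one time, averaged over the visible band."""
          lams = [400e-9 + i * 300e-9 / (n - 1) for i in range(n)]
          return sum(plenoptic(theta, phi, lam, t0, *viewpoint) for lam in lams) / n

      print(grayscale_snapshot(0.3, 1.2))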

  • Slide 8/58

    Sampling the Plenoptic Function

  • Slide 9/58

    Surface Camera

    Lighting

    Camera

    A camera is a device for capturing and storing

    samples of the Plenoptic Function

  • Slide 10/58

    Building Better Cameras

    Capture more rays!

    Higher density sensor arrays

    Color cameras, multi-spectral cameras

    Video cameras

  • Slide 11/58

    Modify Optics: Wide-Angle Imaging

    Multiple Cameras. Examples: Disney 55, McCutchen 91, Nalwa 96, Swaminathan & Nayar 99, Cutler et al. 02

    Catadioptric Imaging. Examples: Rees 70, Charles 87, Nayar 88, Yagi 90, Hong 91, Yamazawa 95, Bogner 95, Nalwa 96, Nayar 97, Chahl & Srinivasan 97

  • Slide 12/58

    Catadioptric Cameras for 360° Imaging

  • Slide 13/58

    Omnidirectional Image

  • Slide 14/58

    Femto Photography

    Femto Flash

    Ultra-Fast Detector

    Computational Optics

    Serious Sync

    Kirmani, Hutchison, Davis, Raskar,

    ICCV, 2009

    A trillion frame-per-second camera

    http://www.youtube.com/watch?v=9xjlck6W020

  • Slide 15/58

    How Much Light is Really in a Scene?

    Light is transported throughout the scene along rays.

    Anchor: any point in 3D space (3 coordinates)

    Direction: any 3D unit vector (2 angles)

    Total of 5 dimensions

    Radiance remains constant along a ray as long as it stays in empty space, which removes one dimension

    Total of 4 dimensions

    (figure: two patches dA1 and dA2 along a ray, subtending solid angles dω1 and dω2, carrying radiances L1 and L2; the radiance is constant along the ray)
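    A short worked version of the constancy claim, using the figure's labels and writing r for the distance between the two patches (r is not named on the slide):

      d\omega_1 = \frac{dA_2}{r^2}, \qquad d\omega_2 = \frac{dA_1}{r^2},
      \qquad L_1\, dA_1\, d\omega_1 = L_2\, dA_2\, d\omega_2
      \;\Longrightarrow\; L_1\, \frac{dA_1\, dA_2}{r^2} = L_2\, \frac{dA_2\, dA_1}{r^2}
      \;\Longrightarrow\; L_1 = L_2

    Since radiance depends only on the ray and not on where along the ray it is measured, one of the five dimensions drops out, leaving the 4D light field.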

  • Slide 16/58

    Ray

    Ignoring time and color, one sample:

    5D

    3D position

    2D direction

    P(θ, φ, VX, VY, VZ)

    Slide by Rick Szeliski and Michael Cohen

  • Slide 17/58

    Surface Camera

    no change in

    radiance

    Lighting

    How can we use this?

  • Slide 18/58

    Ray Reuse

    Infinite line

    Assume light is constant (vacuum)

    4D

    2D direction

    2D position

    non-dispersive medium

    Slide by Rick Szeliski and Michael Cohen

  • Slide 19/58

    Only need Plenoptic Surface

  • Slide 20/58

    Light Field - Organization

    2D position

    2D direction

    (figure: a ray parameterized by position s and direction θ)

    Slide by Rick Szeliski and Michael Cohen

  • Slide 21/58

    Light Field - Organization

    2D position

    2D position

    2 plane parameterization

    (figure: the two parallel planes, labeled s and u)

    Slide by Rick Szeliski and Michael Cohen

  • Slide 22/58

    Light Field - Organization

    2D position

    2D position

    2 plane parameterization

    (figure: a ray crossing the s,t plane and the u,v plane, with axes s, t and u, v)

    Slide by Rick Szeliski and Michael Cohen
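    A minimal sketch of the two-plane parameterization, assuming axis-aligned planes at z = 0 (s, t) and z = 1 (u, v); the plane placement is a choice made here, not something the slides specify:

      import numpy as np

      def ray_to_st_uv(origin, direction, z_st=0.0, z_uv=1.0):
          """Map a ray to its (s, t, u, v) coordinates by intersecting it with the two planes."""
          o = np.asarray(origin, dtype=float)
          d = np.asarray(direction, dtype=float)
          if abs(d[2]) < 1e-12:
              raise ValueError("ray is parallel to the parameterization planes")
          s, t = (o + (z_st - o[2]) / d[2] * d)[:2]   # hit point on the s,t plane
          u, v = (o + (z_uv - o[2]) / d[2] * d)[:2]   # hit point on the u,v plane
          return s, t, u, v

      # Example: a ray from (0.2, 0.1, -1) along +z crosses both planes at x = 0.2, y = 0.1.
      print(ray_to_st_uv((0.2, 0.1, -1.0), (0.0, 0.0, 1.0)))   # (0.2, 0.1, 0.2, 0.1)

    Holding s, t fixed and varying u, v then indexes exactly the image described on the next slide.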

  • Slide 23/58

    Light Field - Organization

    Hold s, t constant

    Let u, v vary

    An image

    s,t u,v

    Slide by Rick Szeliski and Michael Cohen

  • Slide 24/58

    Light Field / Lumigraph

  • Slide 25/58

    Light Field - Capture

    Idea 1

    Move camera carefully over the s, t plane

    Gantry

    s,t u,v

    Slide by Rick Szeliski and Michael Cohen

  • Slide 26/58

    Gantry

    Lazy Susan: manually rotated

    XY Positioner

    Lights turn with lazy susan

    Correctness by construction

  • Slide 27/58

  • Slide 28/58

    Light Field - Rendering

    For each output pixel:

    determine s, t, u, v

    either use the closest discrete RGB sample, or interpolate nearby values

    Slide by Rick Szeliski and Michael Cohen

  • Slide 29/58

    Light Field - Rendering

    Nearest: closest s, closest u, then draw it

    Blend 16 nearest: quadrilinear interpolation

    Slide by Rick Szeliski and Michael Cohen
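    A sketch of the "blend 16 nearest" option, assuming the light field is stored as a dense array lf[s, t, u, v, channel] and that the query (s, t, u, v) is already expressed in grid units (the array layout and names here are assumptions):

      import numpy as np

      def sample_light_field(lf, s, t, u, v):
          """Quadrilinear interpolation of the 16 discrete samples around (s, t, u, v)."""
          coords = np.array([s, t, u, v], dtype=float)
          lo = np.clip(np.floor(coords).astype(int), 0, np.array(lf.shape[:4]) - 2)
          frac = coords - lo
          out = np.zeros(lf.shape[4:], dtype=float)
          for corner in range(16):                     # 2 choices per axis = 16 neighbours
              idx, weight = [], 1.0
              for axis in range(4):
                  bit = (corner >> axis) & 1
                  idx.append(lo[axis] + bit)
                  weight *= frac[axis] if bit else 1.0 - frac[axis]
              out += weight * lf[tuple(idx)]
          return out

      # The "nearest" option is just lf[round(s), round(t), round(u), round(v)].
      lf = np.random.rand(8, 8, 16, 16, 3)
      print(sample_light_field(lf, 3.2, 4.7, 8.1, 9.9))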

  • Slide 30/58

    Devices for Capturing Light Fields

    small baseline

    big baseline

    handheld camera [Buehler 2001]

    camera gantry [Stanford 2002]

    array of cameras [Wilburn 2005]

    plenoptic camera [Ng 2005]

    light field microscope [Levoy 2006]

    (using geometrical optics)

  • Slide 31/58

    Multi-Camera Arrays

    Stanford's: 640 × 480 pixels, 30 fps, 128 cameras

    synchronized timing

    continuous streaming

    flexible arrangement

  • Slide 32/58

    Stanford Tiled Camera Array

  • Slide 33/58

    What's a Light Field Good For?

    Synthetic aperture photography

    Seeing through occlusions

    Refocusing, changing depth of field

    Synthesizing images from novel viewpoints

  • Slide 34/58

    Synthetic Aperture Photography [Vaish, CVPR 2004]

    45 cameras aimed at bushes

  • Slide 35/58

    Synthetic Aperture Photography

    One image of people behind bushes

    Reconstructed synthetic aperture image
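    A simplified shift-and-add sketch of the idea (the real system of Vaish et al. aligns the cameras with calibrated homographies; here each image is just translated by an amount proportional to its baseline divided by the chosen focus depth, so the proportionality constant and names are assumptions):

      import numpy as np

      def synthetic_aperture(images, baselines, focus_depth):
          """Average camera-array images after shifting each so the chosen depth aligns.

          images:      list of H x W arrays, one per camera
          baselines:   per-camera (dx, dy) offsets in the array plane
          focus_depth: depth of the plane to bring into focus (same units as dx, dy)"""
          acc = np.zeros_like(images[0], dtype=float)
          for img, (dx, dy) in zip(images, baselines):
              shift_x = int(round(dx / focus_depth))   # parallax shrinks with depth
              shift_y = int(round(dy / focus_depth))
              acc += np.roll(np.roll(img, shift_y, axis=0), shift_x, axis=1)
          return acc / len(images)

    Objects at the focus depth stay registered across all shifted images and remain sharp, while the occluding bushes, at a different depth, are smeared across the very large synthetic aperture.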

  • Slide 36/58

    Synthetic Aperture Photography

  • Slide 37/58

    Light Field Photography using a Handheld Light Field Camera

  • Slide 38/58

    Light Field Photography using a Handheld Light Field Camera

    Ren Ng, Marc Levoy, Mathieu Brédif, Gene Duval, Mark Horowitz and Pat Hanrahan

    Proc. SIGGRAPH 2005

    Source: M. Levoy


  • Slide 39/58

    Lytro Light Field Camera

    www.lytro.com

    $400 (Nov. 2011)

    8× optical zoom, f/2 lens, 8 GB memory (350 images at 1080 × 1080)

  • Slide 40/58

    Conventional vs Light Field Camera

    Source: M. Levoy

  • Slide 41/58

    Conventional vs Light Field Camera

    uv-plane st-plane

    Source: M. Levoy

  • Slide 42/58

    Conventional vs Light Field Camera

    uv-plane st-plane

    Source: M. Levoy


  • Slide 43/58

    Prototype Camera

    4000 × 4000 pixels ÷ 292 × 292 lenses = 14 × 14 pixels per lens

    Contax medium format camera, Kodak 16-megapixel sensor

    Adaptive Optics microlens array, 125 μm square-sided microlenses
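    A quick check of the pixels-per-lens figure:

      \frac{4000 \times 4000\ \text{pixels}}{292 \times 292\ \text{lenses}}
        \approx 13.7 \times 13.7 \approx 14 \times 14\ \text{pixels per lens}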

  • Slide 44/58

    Mechanical Design

    microlenses float 500 μm above the sensor

    focused using 3 precision screws

    Source: M. Levoy

  • Slide 45/58

  • Slide 46/58

    (figure: three crops of the raw sensor image, labeled a, b, and c)

    (a) illustrates microlenses at depths closer than the focal plane. In these right-side-up microlens images, the woman's cheek appears on the left, as it appears in the macroscopic image. In contrast, (b) illustrates microlenses at depths further than the focal plane. In these inverted microlens images, the man's cheek appears on the right, opposite the macroscopic world. This effect is due to inversion of the microlens rays as they pass through the world focal plane before arriving at the main lens. Finally, (c) illustrates microlenses on edges at the focal plane (the fingers that are clasped together). The microlenses at this depth are constant in color because all the rays arriving at the microlens originate from the same point on the fingers, which reflect light diffusely.


  • Slide 47/58

    Digitally Stopping-Down

    stopping down = summing only the central portion of each microlens

    Source: M. Levoy
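    A sketch of digital stopping-down, assuming the decoded light field is stored as lf[s, t, u, v], where (s, t) indexes microlenses and (u, v) indexes the pixels under each microlens (i.e. positions on the main-lens aperture); the layout and names are assumptions, not the paper's code:

      import numpy as np

      def digitally_stop_down(lf, aperture_fraction=0.5):
          """Average only a central (u, v) window under each microlens.

          Keeping only the central portion of each microlens image simulates a
          smaller aperture: more depth of field, at the cost of light."""
          nu, nv = lf.shape[2], lf.shape[3]
          ku = max(1, int(round(nu * aperture_fraction)))
          kv = max(1, int(round(nv * aperture_fraction)))
          u0, v0 = (nu - ku) // 2, (nv - kv) // 2
          central = lf[:, :, u0:u0 + ku, v0:v0 + kv]
          return central.mean(axis=(2, 3))   # one output value per microlens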


  • Slide 48/58

    Digital Refocusing

    refocusing = summing windows extracted from several microlenses

    Source: M. Levoy
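    A sketch of refocusing as a shift-and-add over sub-aperture images, with the same lf[s, t, u, v] layout assumed above; the integer-shift approximation and the refocus parameter alpha are simplifications chosen here (Ng's formulation uses fractional shifts):

      import numpy as np

      def digitally_refocus(lf, alpha):
          """Shift each sub-aperture image in (s, t) in proportion to its (u, v)
          aperture position, then average; alpha sets the synthetic focal plane."""
          ns, nt, nu, nv = lf.shape[:4]
          out = np.zeros(lf.shape[:2] + lf.shape[4:], dtype=float)
          for u in range(nu):
              for v in range(nv):
                  ds = int(round(alpha * (u - nu // 2)))
                  dt = int(round(alpha * (v - nv // 2)))
                  out += np.roll(np.roll(lf[:, :, u, v], ds, axis=0), dt, axis=1)
          return out / (nu * nv)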


  • Slide 49/58

    Example of Digital Refocusing

    Source: M. Levoy


  • Slide 50/58

    Extending the Depth of Field

    conventional photograph, main lens at f/22

    conventional photograph, main lens at f/4

    light field, main lens at f/4, after all-focus algorithm

    [Agarwala 2004]
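    The all-focus result above uses the photomontage method of [Agarwala 2004]; as a much simpler stand-in, a per-pixel sharpness vote over a stack of digitally refocused grayscale images conveys the idea without the seam-aware optimization:

      import numpy as np
      from scipy.ndimage import laplace, uniform_filter

      def naive_all_focus(refocused_stack):
          """Pick, per pixel, the refocused slice with the strongest local contrast."""
          stack = np.asarray(refocused_stack, dtype=float)           # shape (n, H, W)
          sharpness = np.stack([uniform_filter(np.abs(laplace(img)), size=9)
                                for img in stack])
          best = np.argmax(sharpness, axis=0)                        # (H, W) slice index
          return np.take_along_axis(stack, best[None], axis=0)[0]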


  • Slide 51/58

    Digitally Moving the Observer

    moving the observer = moving the window we extract from the microlenses

    Source: M. Levoy
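    A sketch of moving the observer with the same lf[s, t, u, v] layout assumed earlier: taking the same (u, v) sample under every microlens gives one sub-aperture view, and changing (u, v) sweeps the virtual viewpoint across the main-lens aperture:

      import numpy as np

      def move_observer(lf, u, v):
          """Extract the sub-aperture image for aperture position (u, v)."""
          return lf[:, :, u, v]

      # Example: step the viewpoint horizontally across a toy 9 x 9 aperture sampling.
      lf = np.random.rand(64, 64, 9, 9)
      views = [move_observer(lf, u, 4) for u in range(9)]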


  • Slide 52/58

    Example of Moving the Observer

    Source: M. Levoy


  • Slide 53/58

    Moving Backward and Forward

    Source: M. Levoy

  • Slide 54/58

    http://lightfield.stanford.edu/lfs.html
  • Slide 55/58

    Light Field Demos (Stanford)


    http://lightfield.stanford.edu/lfs.html
  • Slide 56/58

    Implications

    Cuts the unwanted link between exposure (due to the aperture) and depth of field

    Trades off (excess) spatial resolution for the ability to refocus and adjust the perspective

    Sensor pixels should be made even smaller, subject to the diffraction limit

    36 mm × 24 mm ÷ 2 μm pixels = 216 megapixels

    18K × 12K pixels

    1800 × 1200 pixels × 10 × 10 rays per pixel

    Source: M. Levoy
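    The numbers above fit together as follows (2 μm pixels over a 36 mm × 24 mm sensor, then 10 × 10 rays traded per output pixel):

      \frac{36\,\text{mm}}{2\,\mu\text{m}} \times \frac{24\,\text{mm}}{2\,\mu\text{m}}
        = 18{,}000 \times 12{,}000 \approx 216\ \text{megapixels},
      \qquad
      \frac{18{,}000}{10} \times \frac{12{,}000}{10} = 1800 \times 1200\ \text{output pixels.}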


  • Slide 57/58

    Other ways to Sample the Plenoptic Function

    Moving in time: the spatio-temporal volume P(θ, φ, t)

    Useful to study temporal changes

    Long an interest of artists

    Claude Monet, Haystacks studies


  • Slide 58/58

    Space-Time Images

    Other ways to slice the plenoptic function:

    (figure: a space-time image, with spatial axis y and temporal axis t)
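    A minimal sketch of one such slice: fix one column of a video volume and let y and t span the image (the video[t, y, x] layout is an assumption made here):

      import numpy as np

      def space_time_image(video, x):
          """Return the y-t slice of a video volume video[t, y, x] at column x."""
          return np.asarray(video)[:, :, x].T    # shape (y, t)

      # Example on a toy volume: 120 frames of 48 x 64 pixels.
      video = np.random.rand(120, 48, 64)
      print(space_time_image(video, 32).shape)   # (48, 120)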