EGSR2005
Real Illumination from Virtual Environments
Abhijeet Ghosh, Matthew Trentacoste, Helge Seetzen, Wolfgang Heidrich
The University of British Columbia
Introduction
Goal: Create sense of immersion in virtual world
• Reproduce real-world intensity levels
• Have viewer adapt to changing environment
– E.g., entering or leaving a tunnel in a driving simulation
Introduction
• Display technology
– Conventional displays
• Too low contrast/intensity
• Adaptation driven by room illumination
– Recent HDR displays [Seetzen et al. 2004]
• Appropriate contrast & intensities
• Limited field of view
• Adaptation driven by room illumination & display
• Problem
– Adaptation depends on viewing conditions
– Viewing conditions mostly unknown
Real Illumination from Virtual Environments
• Active control of room illumination
– Consistent with virtual world
• Used in combination with HDR displays
– Wide range of uniform & directional illumination
• Triggers natural adaptation processes in HVS
– E.g., moving between light & dark environments
• Directional information in peripheral view
Related Work
• Tone mapping operators with visual adaptation
– Threshold vs. intensity curve [Ferwerda et al. 1996]
– Supra-threshold brightness, color & acuity [Pattanaik et al. 1998]
• Time-dependent state of adaptation
– Recent history of viewing conditions, e.g. [Irawan et al. 2005] [Scheel et al. 2000] [Durand & Dorsey 2000] [Pattanaik et al. 2000]
• Impact of room lighting on ergonomic factors
– Sensors to adjust display brightness & contrast, e.g. [Antwerp 1985] [Berbier et al. 1991]
• Perceived brightness vs. luminance
– Assumption: room lighting is the primary factor of adaptation, e.g. [Bartleson et al. 1967] [DeMarsh 1972]
Related Work
• Light-sensitive display [Nayar et al. 2004]
– Track changes in illumination for shading virtual objects
• Light Stage 3 [Debevec et al. 2002]
– Illuminate actors for consistency with virtual world
• Philips Ambilight TV
– Light emanates from the back onto the wall
– Reduces viewing contrast
• Fully immersive CAVE [Cruz-Neira et al. 1993]
– Very limited contrast
– HDR CAVEs infeasible
• Physical light props in theme parks
Prototype Implementation
• Entertainment applications
– Gaming environments
– Home theaters
• 24 RGB LED lights (ColorKinetics iColor Cove)
– Each addressable to 24-bit RGB
– Specified at 52.4 lumens for full white
• Ambient as well as directional illumination
– Positioned outside direct field of view
– Low-resolution version of virtual environment
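Since each light is addressable as 24-bit RGB (8 bits per channel), the controller must quantize whatever normalized drive values the renderer produces. A minimal sketch; the function name and clamping behavior are illustrative, not from the paper, and the actual light protocol is not described here:

```python
def to_24bit(rgb):
    """Quantize a normalized (0..1) RGB triple to the lights' 24-bit
    addressable range (8 bits per channel), clamping out-of-range values.
    Hypothetical helper, not the ColorKinetics API."""
    return tuple(max(0, min(255, round(c * 255))) for c in rgb)

# Example drive value for one LED fixture
print(to_24bit((0.0, 0.5, 1.0)))  # → (0, 128, 255)
```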
Prototype Implementation
Room housing our system, with all lights switched on
Illumination resembling the Grace Cathedral environment
Room Setup
• Room dimensions: 15.5' long, 9' wide, 9.5' high
• Room kept as is: original pastel color, whiteboards, reflectors
• 18" LED HDR display [Seetzen et al. 2004]
• 18" NEC MultiSync LCD 1850e display
Diffusion
Undiffused vs. diffused light (comparison images)
• Diffusers for smooth variation & preventing color banding
• 2" transparent acrylic tubing, spray-painted white
Light Setup
Opened iColor Cove; reflective film inside
• Avoid internal reflection losses
Geometric Calibration
Light probe images for the 24 lights
• Light probe placed at intended viewer position
– Photographed using a webcam
• Model each light as a Gaussian centered around the brightest spot
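The per-light Gaussian model can be used to drive each LED from an environment map: weight every EM sample by the light's angular Gaussian and take the weighted average. A sketch under stated assumptions (pure Python, a single shared `sigma`, and list-based EM samples; the paper's actual fitting procedure and resolution are not reproduced here):

```python
import math

def gaussian_weight(d, center, sigma):
    # Angular Gaussian falloff around a light's brightest direction.
    ang = math.acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(d, center)))))
    return math.exp(-0.5 * (ang / sigma) ** 2)

def project_em_onto_lights(em, centers, sigma=0.5):
    """em: list of (unit_direction, (r, g, b)) samples of the environment map.
    centers: per-light brightest-spot directions from the light probe images.
    Returns one RGB drive value per light: the EM radiance averaged under
    that light's Gaussian."""
    drives = []
    for c in centers:
        wsum, acc = 0.0, [0.0, 0.0, 0.0]
        for d, rgb in em:
            w = gaussian_weight(d, c, sigma)
            wsum += w
            for i in range(3):
                acc[i] += w * rgb[i]
        drives.append(tuple(v / wsum for v in acc))
    return drives
```

With a narrow `sigma`, a light aimed straight at a bright EM region picks up essentially only that region's radiance.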
White Point Calibration
• Establish relative intensities of lights & display
– Light adjustment using 18% gray card
• Match gray card reflection with 18% monitor gray
– Response function of display
– Channel-wise for RGB
• Recover global scaling factors
– Modulate light intensities at run time
• Needs to be repeated if setup changes
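The global scaling factors can be thought of as per-channel ratios between the display's 18% gray and the gray card's reflection under the lights, applied to every light value at run time. A hedged sketch; function names and the measured values in the example are illustrative, not from the paper:

```python
def whitepoint_scales(card_rgb, display_gray_rgb):
    """Per-channel global scale factors that make the gray card's reflection
    under the room lights match the display's 18% gray.
    Both arguments are hypothetical measured (r, g, b) values."""
    return tuple(d / c for d, c in zip(display_gray_rgb, card_rgb))

def apply_scales(light_rgb, scales):
    # Modulate a light's intensity at run time with the calibration scales.
    return tuple(min(1.0, v * s) for v, s in zip(light_rgb, scales))
```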
Content Creation - Synthetic Scenes
• EM for viewer position easily computable
– Commonplace in shading
– Many applications already support generation
– Cube map or lat-long from ray tracer
– Cube map for graphics hardware
• Electronic Arts' Need for Speed 2
– Layered 8-bit EMs correspond to lights, sky, geometry
– While LDR, each layer can be scaled independently to give coarse HDR data
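Scaling the layered 8-bit EMs independently and summing them yields the coarse HDR data: the lights layer would get a much larger scale than the sky or geometry layers. A minimal sketch assuming flat per-texel lists and illustrative scale values, not the game's actual data:

```python
def coarse_hdr(layers, scales):
    """Combine layered 8-bit EMs (e.g. lights / sky / geometry) into coarse
    HDR by scaling each LDR layer independently and summing.
    layers: equal-length flat lists of 8-bit texel values.
    scales: one illustrative float per layer."""
    out = [0.0] * len(layers[0])
    for layer, s in zip(layers, scales):
        for i, v in enumerate(layer):
            out[i] += s * (v / 255.0)
    return out

# Two texels: first lit by the lights layer (scale 10), second by sky (scale 1)
print(coarse_hdr([[255, 0], [0, 255]], [10.0, 1.0]))  # → [10.0, 1.0]
```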
Content Creation - Captured Footage
• Existing footage
– Manual authoring of lighting environment
– Mapping video frame to lights
• Semi-automatic method for creating lighting for HDR driving video
– Take key frames of video, use periphery for EM, interpolate
– Other techniques: EMs from eyes [Nishino and Nayar 2004]
• New footage
– Light probe commonly captured in scene to match special effects with live footage
– Composite of synthetic scene (effects) and actual scene (light probe) to create EM
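The semi-automatic key-frame approach reduces, in the simplest case, to linearly interpolating environment maps between key frames of the video. A sketch assuming flat RGB texel lists and frame-indexed key frames (function names are mine, not the paper's):

```python
def lerp_em(em_a, em_b, t):
    # Linear blend of two key-frame environment maps (flat texel lists).
    return [(1.0 - t) * a + t * b for a, b in zip(em_a, em_b)]

def em_for_frame(keyframes, frame):
    """keyframes: sorted list of (frame_index, em). Returns the EM for the
    requested frame by interpolating between the two bracketing key frames;
    frames past the last key frame just hold its EM."""
    for (f0, e0), (f1, e1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            return lerp_em(e0, e1, (frame - f0) / (f1 - f0))
    return keyframes[-1][1]
```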
Experimental Setup
• Full control of lighting environment
– House lights off
– Covered windows
• Focus on HDR display
– Time given at start to adjust to dim lighting
– LDR still used for comparison tests
• Targeting entertainment applications
– User preference more important than accuracy of realism
– Responses from 12 students in non-graphics-related areas
Experiment: Uniform Dynamic Illumination
• Compare same scene with constant and uniform dynamic illumination
• HDR driving video
• Users comment on bright or dark room vs. dynamic illumination
Experiment: Directional Illumination
• Using interactive Grace Cathedral EM
• Both constant and dynamic illumination
• Users comment on how much it enhanced their sense of orientation and their overall preference
Experiment: Need for Speed Footage
• Playing videos dumped from NFS2 with EMs captured at the camera position of each frame
• Compare constant illumination with playing captured EMs along with video
Experiments - Video
RealIllum.mov
Discussion
• Strong support for lighting used with HDR display
– All participants preferred it to constant illumination
– Closer match between display and room intensity
• Usable with conventional displays
– Participants responded predominantly positively
– One participant found LDR display + EM irritating, but still preferred HDR display + EM to HDR display + constant
– More work needed to find the best match in this scenario
• Other form factors
– Compact, ceiling-mounted projection system
– Interface with controls of home automation lighting
Conclusions
• Active control of room illumination
– Match virtual world
• Trigger natural adaptation processes in HVS
• Directional illumination in peripheral view
• User survey to verify the concept
– Overwhelming support with HDR display
– Mostly positive response with conventional display
Future Work
• Artistic tools for content creation
• Evaluation of performance in task-oriented applications
• Repackaging lights into a single housing
– Light sensors for calibration
• Plugging into existing home-automation systems
Acknowledgements
• Electronic Arts for “Need for Speed Underground 2”
• Paul Debevec and graphics group at K.U. Leuven for HDR environment maps
• Grzegorz Krawczyk and Rafal Mantiuk for HDR driving video
• ATI Technologies Fellowship