
Real-Time Rendering of Natural Illumination

Graduation/Master’s Thesis, 20 credits

Tutor: Anders Backman

Simon Rönnberg

April 21, 2004


Abstract

How to deal with outdoor scenes in computer graphics has until recently been a neglected topic. Most research in computer graphics has traditionally been geared towards indoor scenes. These methods cannot be used for rendering natural illumination in outdoor scenes.

Lately, researchers have begun examining outdoor rendering in greater detail. One of the most important parts of rendering outdoor scenes is that the natural illumination, the light from the sun and the sky, is rendered correctly with shadows.

Several methods for performing large-scale illumination are examined and their advantages and disadvantages are explained. The use of spherical harmonics to increase real-time rendering performance is examined in detail.

The daylight models developed by the International Commission on Illumination are examined using spherical harmonics as a basis. The thesis is concluded with a discussion of the performance and the visible results of this system. The reasons for using these methods are also discussed.




Contents

1 Introduction
  1.1 Background
  1.2 Goal of the thesis
  1.3 Outline

2 Lighting
  2.1 Basics - the physics of light
  2.2 Direct illumination
  2.3 Global illumination

3 Global illumination
  3.1 Raytracing
  3.2 Radiosity
  3.3 Photon mapping
  3.4 Spherical harmonic lighting

4 Spherical harmonic lighting - in depth
  4.1 Legendre polynomials - the basis
  4.2 Spherical harmonics
  4.3 Projecting to spherical harmonics
  4.4 Lighting diffuse surfaces with spherical harmonics
    4.4.1 Diffuse unshadowed transfer
    4.4.2 Diffuse shadowed transfer
    4.4.3 Diffuse interreflected transfer
  4.5 Rendering spherical harmonics lighting

5 Implementation
  5.1 OpenSceneGraph - the framework
  5.2 System design
  5.3 System usage

6 Lighting models
  6.1 Implementing light with spherical harmonics
  6.2 Analytic models
    6.2.1 Older CIE models
    6.2.2 CIE General Sky model
  6.3 Integration with framework

7 Results
  7.1 Performance
  7.2 Visible results


8 Discussion
  8.1 Motivation
  8.2 Performance and results

9 Acknowledgments

References


List of Examples

1 Calculation of transfer coefficients
2 Rendering using SH lighting
3 Animation callback

List of Figures

1 Light-matter interaction
2 Law of Reflection
3 Spherical coordinates
4 Differential solid angle
5 The raytracing concept
6 Form factor geometry
7 Hemicube evaluation
8 Spherical harmonics basis functions
9 Class dependencies
10 CIE Overcast Sky
11 CIE Clear Sky
12 CIE Partly Overcast Sky
13 4000 vertices with shadows
14 50000 vertices with shadows
15 50000 vertices without shadows
16 4000 vertices
17 50000 vertices


1 Introduction

1.1 Background

Most research in computer graphics with respect to real-time applications has been on techniques to render indoor scenes. Most of these can't be used when looking at outdoor scenes and rendering of natural illumination.

With the increasing computing power of standard computers and the increased performance and capabilities of graphics hardware, outdoor rendering is becoming more and more feasible. Lately, many researchers have begun looking at realistic rendering of outdoor scenes.

The big difference between indoor and outdoor rendering, when considering light only, is that the sky will usually illuminate a point from all directions. Because of this, most traditional rendering techniques are unfeasible for the outdoors. What is needed is basically a way of describing the entire sphere of influence around each point in the scene in such a way that real-time rendering is possible.

The most important issue when dealing with outdoor scenes is the visual realism of all the natural light sources, such as the sun, the moon and the sky. The shadows cast from objects as a result of skylight and sunlight are extremely important for any outdoor scene. The facts that shadows move over time, and that they are often very soft, because the entire sphere of influence around a point contributes to its illumination, are very important for realistically rendering the outdoors.

There exist numerous techniques for rendering arbitrary illumination models with shadows and contribution from the entire sphere of influence. Most of these techniques are batch-oriented, however, and not really suited for real-time applications.

This thesis will focus on rendering daylight in real-time for static scenes, where the illumination function can be changed interactively.

1.2 Goal of the thesis

The goal of this thesis is to investigate and implement a rendering and lighting system that can take geometric specifications and a parameterized lighting model and create a natural looking model. The parameters can include time of day, day of year, geographic location and amount of clouds in the sky.

There exist several tools on the market for illuminating 3D objects, among them 3D Studio MAX, Maya and Lightwave. The drawback with these tools is that they are batch-oriented. That is, the model is lit, exported and finally rendered in real-time. One of the driving forces of this project is to implement a system where the illumination is controllable and as close to interactive as possible. This means that the end-user application (the viewer) can control the lighting parameters and see the visual result.


The primary goal of the thesis is to examine the field of real-time rendering of daylight.

The secondary goal is to build a system upon OpenSceneGraph, an open source scene graph, that from a given geometry and lighting parameters can light the model in a natural looking way, including static shadows. The system should be dynamic with respect to light, controllable and realistic.

1.3 Outline

In section 1 the background and goal of the thesis are given.

Section 2 covers the basics of light and lighting. The difference between traditional lighting methods, called direct illumination, and global illumination is explained.

In section 3 several methods for performing global illumination are investigated.

Section 4 is an in-depth exploration of spherical harmonics: what it is, how it works and how it can be used from a rendering perspective. The mathematical foundation for spherical harmonics is explained in detail, and the basic design of a system using spherical harmonics is described.

Section 5 describes the framework, OpenSceneGraph, used for the practical part of this thesis. It also covers the implementation of the spherical harmonics system using OpenSceneGraph. This section also describes how the system can be used to provide natural lighting in an application.

The different ways of implementing light sources using spherical harmonics are explained in section 6. A few analytical models for simulating daylight are also explained, along with their implementation.

Section 7 is a presentation of the results. Screenshots from the application can be seen here, along with performance results.

Section 8 contains a discussion of the results from section 7. The reasons for choosing spherical harmonics, and the future of the field, are also discussed here.

Finally, in section 9, the people who have helped during the work on this thesis are acknowledged.


2 Lighting

2.1 Basics - the physics of light

The process of rendering light correctly in computer graphics is basically the process of approximating the physics of light. What is needed is a way to determine, for each point on a surface, the color that point will have in a given lighting environment. To design a good approximation, some knowledge about the physics of light interaction is needed.

Visible light, in its basic form, is electromagnetic radiation with wavelengths between 380 and 770 nanometers. When an electromagnetic wave hits a surface, a complicated, dynamic process occurs that determines what color that surface will have to the human eye. This interaction (figure 1) between light and matter is governed by the physical properties of both the light and the material of the surface [18].

Figure 1: Light-matter interaction

To more accurately model the underlying physics of light, the electromagnetic wave is modeled as a "packet" of energy traveling at the speed of light, called a photon. The brightness of light is the number of photons, and the color of light is the energy contained in each photon.

When a photon hits a surface, one of three things will happen. The photon will either be reflected, absorbed or transmitted.

Figure 2: Law of Reflection

If the photon is reflected, it "bounces" off the surface and keeps on traveling in a different direction. The nature of the surface determines the angle of reflection for the photon. The Law of Reflection (figure 2) specifies that the angle of reflection is equal to the angle of incidence.

When a photon reflects off a smooth surface, it is called specular reflection. Most materials, however, are not perfectly smooth. Small irregularities and bumps in a material make the angle of reflection more unpredictable, though the Law of Reflection still holds: the photon travels in the direction of reflection determined by the normal to the surface at the point of interaction. When a photon is reflected from a rough surface, it is called diffuse reflection. The angle of reflection for a diffuse surface can be modeled with probability.

On transmission, also called refraction, the photon travels through the surface and continues in some direction on the other side. A surface that refracts light is usually called transparent. The direction in which the photon travels after interaction with the surface is determined by the refraction indexes of the two mediums.

When a photon is absorbed by matter, the photon's energy causes the atoms in the matter to excite, i.e. jump to a higher state of energy. When an atom returns to its original state of energy, it will reemit light. This process is called scattering and is very important when modeling the atmosphere. When scattering occurs, not all energy is reemitted, but only part of the energy. This gives rise to a lower frequency light wave. When light is scattered on objects smaller than the wavelength of the light, the process is called Rayleigh scattering [15]. It is this process that determines the color of the sky. Higher frequencies are scattered the most, while lower frequencies are scattered to a much lesser degree. During the day, the atmosphere scatters the light from the sun to somewhere in the blue range, while during the evening, when light travels a greater distance through the atmosphere, it is scattered to the red range.
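The qualitative trend above can be quantified with the standard 1/λ⁴ wavelength dependence of Rayleigh scattering (a textbook result assumed here, not derived in this thesis). A small sketch with illustrative wavelengths for blue and red light:

```python
# Rayleigh scattering strength falls off as 1/lambda^4 (standard result,
# assumed here; the text only states the qualitative trend).
def rayleigh_relative(lam_nm, ref_nm=650.0):
    """Scattering strength at wavelength lam_nm relative to ref_nm."""
    return (ref_nm / lam_nm) ** 4

# Blue light (~450 nm) scatters roughly four times more than red (~650 nm),
# which is why the clear daytime sky looks blue.
print(round(rayleigh_relative(450.0), 2))  # -> 4.35
```

The same ratio explains red sunsets: over the long atmospheric path at dusk, most of the strongly scattered blue component is removed before the light reaches the observer.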

How can this process be modeled on a computer?

The answer is to use Bidirectional Reflectance Distribution Functions, BRDFs [18]. By creating such functions for all surfaces, the physics of light-matter interaction becomes much easier to model. The BRDF is defined as the percentage of incoming light from a specific direction at a specific point that is reflected into a specific outgoing direction. A similar function, the BRTF, can be defined for transmission.

In general, the percentage of the light that is reflected depends on both the position of the viewer and the position of the light. This is therefore included in the BRDF, and as mentioned above, the function is defined as the percentage of light outgoing in a certain direction in comparison to light incoming from a certain direction [18].

The degree of reflection is also in general dependent on the frequency of the light. Therefore the wavelength or the frequency must also be included in the function.

A BRDF is also in general positionally variant. This means that the light-matter interaction behaves differently at different points on the surface. Diffuse surfaces in particular, which are rough in general, reflect light differently at different positions. As mentioned above, the BRDF is defined for a specific point, so the positional variance is included. Homogeneous surfaces will not exhibit any positional variance.


Most natural materials, however, are heterogeneous in nature. These can be modeled using several homogeneous surfaces, though [18].

The general BRDF equation is defined as follows [18]:

frλ(x, ωi, ωo)

x - position on the surface
ωi - incoming direction
ωo - outgoing direction
λ - wavelength

When considering the BRDF, the wavelength is often omitted and the value of the BRDF is represented as a vector with 3 values for red, green and blue light. This is because these 3 colors are sufficient to model any color.

There are two different classes of BRDFs, isotropic and anisotropic functions [18]. Isotropic refers to the fact that the reflectance properties are invariant with respect to rotation of the surface around the surface normal. Very smooth surfaces exhibit this kind of reflectance property. Anisotropic is the opposite of isotropic and simply means that the reflectance varies when the surface is rotated around its normal. Most materials in nature are anisotropic to some degree. The isotropic BRDF is important, though, because most materials are mostly isotropic with a low degree of anisotropy.

Two important properties of BRDFs are reciprocity and conservation of energy [18]. Reciprocity means that the reflection direction can be switched with the incoming direction and the BRDF will give the same value. Conservation of energy simply means that the amount of outgoing light cannot be greater than the amount of incoming light.

That is,

∫Ω fr(x, ωi, ωo) cos θ dωo ≤ 1.

The symbol ∫Ω indicates the integral over a hemisphere of all directions.
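As a concrete check of conservation of energy, consider a Lambertian (perfectly diffuse) surface with albedo ρ, whose BRDF is the constant ρ/π (a standard example, not one the thesis singles out). The hemisphere integral of fr cos θ then evaluates to exactly ρ, which satisfies the inequality whenever ρ ≤ 1:

```python
import math

# Energy conservation check for a Lambertian BRDF f_r = albedo / pi
# (a standard example; the thesis does not fix a particular BRDF here).
# The hemisphere integral of f_r * cos(theta) must not exceed 1.
def reflected_fraction(albedo, n=2000):
    dt = (math.pi / 2) / n
    total = 0.0
    for i in range(n):
        theta = (i + 0.5) * dt
        # d_omega = sin(theta) dtheta dphi; the phi integral contributes 2*pi
        total += (albedo / math.pi) * math.cos(theta) * math.sin(theta) * dt
    return total * 2 * math.pi

print(round(reflected_fraction(0.8), 4))  # -> 0.8
```

The factor 1/π in the Lambertian BRDF is exactly what makes the integral come out to the albedo rather than something larger.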

Figure 3: Spherical coordinates

Lighting in general is expressed as an integral over the sphere (usually the hemisphere of incoming light). Because of this, it makes more sense to express the BRDF in terms of spherical coordinates instead of the Cartesian coordinates commonly used in 3D computer graphics. Spherical coordinates [18] are defined as (θ, φ) (figure 3), where θ is the angle from the zenith axis and positive φ is the positive angle west of south in the horizontal plane.

Conversion from spherical coordinates to Cartesian coordinates can be done with the following formulas:

x = sin θ cos φ
y = sin θ sin φ
z = cos θ
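These formulas translate directly into code, and a quick sanity check is that any (θ, φ) pair must land on the unit sphere:

```python
import math

# Spherical (theta from the zenith, phi in the horizontal plane) to Cartesian.
def spherical_to_cartesian(theta, phi):
    x = math.sin(theta) * math.cos(phi)
    y = math.sin(theta) * math.sin(phi)
    z = math.cos(theta)
    return x, y, z

x, y, z = spherical_to_cartesian(math.pi / 3, math.pi / 4)
print(round(x * x + y * y + z * z, 6))  # -> 1.0, always on the unit sphere
```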

When talking about light coming from a certain direction, it makes no sense to consider only a ray. Because a ray is infinitely thin, virtually no photons will ever travel exactly along that ray. Instead, a certain area of space surrounding the direction ray is considered. By defining a small patch on the surface of the unit sphere, a volume defined by the center of the sphere and this area can be used to measure the amount of incoming light. This area on the surface of the unit sphere is called a differential solid angle (figure 4) [18]. It is defined as:

dω = width × height = (sin θ dφ)(dθ) = sin θ dθ dφ
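Summing these differential solid angles over all θ and φ should recover the full sphere's 4π steradians, a useful check that the sin θ weighting is right:

```python
import math

# Integrate d_omega = sin(theta) dtheta dphi over the whole sphere.
# The result should be the full sphere's solid angle, 4*pi steradians.
def total_solid_angle(n=2000):
    dt = math.pi / n
    s = sum(math.sin((i + 0.5) * dt) for i in range(n))
    return s * dt * 2 * math.pi  # the phi integral contributes 2*pi

print(round(total_solid_angle() / (4 * math.pi), 6))  # -> 1.0
```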

Figure 4: Differential solid angle

The entire scene surrounding a point contributes to the color of that point. This phenomenon is called interreflection and is extremely important for realistic lighting of natural materials, diffuse ones in particular. This is because all surfaces reflect light onto other surfaces in the vicinity, resulting in indirect light lighting virtually every surface in a scene.

A special phenomenon this also gives rise to is color bleeding. This occurs because a photon changes energy, and thus color, when it reflects off a surface. When the photon then hits another surface, it illuminates that surface with its current energy. Hence, color will "bleed" over from one surface to all other nearby surfaces.

The amount of light outgoing in a certain direction from a point on a surface can be described as the integral over the sphere of all incoming light modulated by the surface BRDF [6].

Lo(x, ωo) = ∫Ω L(x, ωi, ωo) dωi

Since the lighting function L(x, ωi, ωo) is in reality a product of the incoming light and the BRDF, we can rewrite it as

L(x, ωi, ωo) = fr(x, ωi, ωo) Ei

where Ei is the incoming light.

The incoming light at a point from a certain direction is affected by three things [18, 6]:

- The outgoing light at some point on some surface in the incoming direction. This can be described as L(x′, ωi).
- The geometric relationship between the point on the emitting object and the point on the receiving object, described by G(x, x′).
- The visibility between the two points, described by V(x, x′).

The incoming light is a product of these three terms.

A surface can also emit light at the point x into the outgoing direction: Le(x, ωo).

Combining these formulas into a single equation gives us the gold standard of rendering, the Rendering Equation [6]:

Lo(x, ωo) = Le(x, ωo) + ∫Ω fr(x, ωi, ωo) L(x′, ωi) G(x, x′) V(x, x′) dωi    (1)

All lighting methods used in computer graphics are approximations of this equation. What the equation says is that the light leaving a point on a surface is the sum of the light emitted by the surface itself and all the light reflected at the point into the outgoing direction.
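To make the integral concrete, consider a heavily simplified setting: no emission, full visibility (V = 1), uniform incoming radiance, and a Lambertian BRDF ρ/π. These are illustrative assumptions, not the thesis's setup; under them the outgoing radiance is exactly ρ times the incoming radiance, and a Monte Carlo estimate of the hemisphere integral converges to that value:

```python
import math, random

# Monte Carlo estimate of the rendering-equation integral under strong
# simplifying assumptions: no emission, visibility V = 1 everywhere,
# constant incoming radiance l_in, Lambertian BRDF albedo/pi.
# The exact answer is then albedo * l_in.
def mc_outgoing_radiance(l_in=1.0, albedo=0.6, samples=100_000, seed=7):
    random.seed(seed)
    total = 0.0
    for _ in range(samples):
        cos_t = random.random()  # uniform hemisphere: cos(theta) ~ U[0, 1]
        total += (albedo / math.pi) * l_in * cos_t
    return (total / samples) * 2 * math.pi  # divide by the pdf 1/(2*pi)

print(round(mc_outgoing_radiance(), 2))  # close to 0.6
```

Real global illumination methods are, in effect, increasingly sophisticated ways of evaluating this same integral with nonuniform lighting, occlusion and interreflection included.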

2.2 Direct illumination

Traditional lighting methods in computer graphics are based on the assumption that direct light is relevant, while indirect (reflected) light is irrelevant. With this assumption, the rendering equation can be simplified to only include light coming directly from a light source.

The rendering equation (equation 1) simplified in this way looks like this:

Lo(x, ωo) = Le(x, ωo) + Σ_{j=1..n} fr(x, ωj, ωo) L(x′j, ωj) G(x, x′j) V(x, x′j)    (2)


where n is the number of point light sources in the scene.

Let’s go back to examining the amount of incoming light from a certain direction (Ei). In a more specific form, it is the amount of light coming in through a differential solid angle, modulated by the angle between the surface normal and the incoming direction. Therefore the amount of incoming light from a point light source to a surface point can be described by Li cos θi dωi, where Li is the amount of light sent out by the light source.

The most popular model for performing direct illumination is the Phong model [18]. This model was developed by Bui Tuong Phong and is defined as follows:

Lo = Li (kd (L · N) + ks (R · V)^n)    (3)

kd - diffuse surface constant
ks - specular surface constant
L - vector from the point of interaction to the light source
N - surface normal at the point
R - reflection vector
V - view vector
n - Phong constant
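Equation 3 is straightforward to evaluate per point. The sketch below uses illustrative material constants and clamps the dot products at zero for back-facing light and view directions, as is usual in practice:

```python
# Direct evaluation of the Phong model (equation 3) for one point light.
# All vectors are assumed normalized; kd, ks and n are illustrative values.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def phong(l_i, kd, ks, n, L, N, R, V):
    diffuse = kd * max(dot(L, N), 0.0)
    specular = ks * max(dot(R, V), 0.0) ** n
    return l_i * (diffuse + specular)

# Light directly above a horizontal surface, viewer looking along R:
# full diffuse plus full specular contribution.
up = (0.0, 0.0, 1.0)
print(phong(1.0, 0.7, 0.3, 10, up, up, up, up))  # -> 1.0
```

Moving V away from R makes the (R · V)^n term fall off sharply, which is what produces the familiar tight Phong highlight.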

If the Phong equation is rewritten, the relationship between the Phong model and the general rendering equation can be found:

Lo = [(kd (L · N) + ks (R · V)^n) / (cos θ dωi)] Li cos θ dωi

So the BRDF in the Phong model is the reflectance function divided by a geometric term. Note that the Phong model doesn't take visibility into account. This can be fixed by several shadowing methods, but that is beyond the scope of this thesis.

Materials which exhibit this kind of reflectance property can be accurately modeled using the Phong model. There are not very many such materials, however.

An ambient term is often added to the Phong model to account for indirect light that is not accounted for explicitly in the equation.

The Phong model can be used to calculate the color at a specific point on a surface. There are several different approaches to choosing which points to calculate the color for, and what to do in the spaces between those points.

The first method used historically is called flat shading. In this method the color is calculated once for each polygon, and the entire polygon is rendered with this color.

The second method is called Gouraud shading, and is based on calculating the color for each vertex and then interpolating the colors over the polygon between the vertices.

The last method, which has also given its name to the model, is called Phong shading. In Phong shading, the vertex normals are interpolated across the polygon and a new color is calculated for each pixel.
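The interpolation step of Gouraud shading can be sketched with barycentric weights (the vertex colors here are arbitrary illustration values, not from the thesis):

```python
# Gouraud shading in miniature: colors computed at the three vertices are
# interpolated across the triangle's interior with barycentric weights.
def gouraud(c0, c1, c2, w0, w1, w2):
    """Interpolate vertex colors; the weights satisfy w0 + w1 + w2 = 1."""
    return tuple(w0 * a + w1 * b + w2 * c for a, b, c in zip(c0, c1, c2))

red, green, blue = (1, 0, 0), (0, 1, 0), (0, 0, 1)
# At a vertex the interpolation reproduces that vertex's color exactly...
print(gouraud(red, green, blue, 1.0, 0.0, 0.0))  # -> (1.0, 0.0, 0.0)
# ...and interior points blend smoothly between the three vertex colors.
center = gouraud(red, green, blue, 1 / 3, 1 / 3, 1 / 3)
```

Phong shading interpolates the normals with the same weights and only then evaluates the lighting model per pixel, which is why it captures highlights that Gouraud shading misses between vertices.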


There are numerous problems and shortcomings with direct illumination. It doesn't take into account reflected light or transmitted light, which is necessary for a realistic looking environment. The Phong model doesn't accurately depict the complexity of light-matter interaction, and it doesn't provide shadowing. The last shortcoming can be fixed with several newer methods for handling shadows, however. Most of these problems arise from the basic assumption that indirect light is irrelevant, which it most certainly is not.

Well, if direct illumination doesn't work, how can the rendering equation be approximated in a more accurate manner?

2.3 Global illumination

The answer is global illumination. The basic difference between global and direct illumination is that global illumination considers indirect light as well as direct light. Shadows are often included implicitly in global illumination solutions.

Global illumination in general is based on tracing photons through a scene, to fully render images with all light paths computed. Hence, a greater part of the rendering equation is solved and the result is more physically correct, and therefore more realistic.

Global illumination techniques are often very expensive to use, though, which is the main reason that they are often used as a precomputation stage and applied as a light map or by use of Gouraud shading.

There are several very well-known techniques in the area of global illumination. The most famous ones are without a doubt raytracing and radiosity.

The oldest of the global illumination techniques is raytracing (chapter 3.1). Raytracing is based on the assumption that the only light rays that are important are those that eventually can be seen by the observer, the eye. The method therefore casts rays from the eye point through the image into the scene, and traces those rays. Usually one ray is sent for each pixel in the final image. Backwards raytracing is the reverse of normal raytracing: rays are sent from the light sources into the scene, and traced until they hit the observer or leave the scene entirely.

The most popular global illumination technique is probably radiosity (chapter 3.2) [5]. Radiosity is based on the assumption that all surfaces are diffuse, and as such reflect light in all directions equally. It is a three-stage method, where the first stage consists of calculating the percentage of outgoing light from one surface that arrives at another surface. This percentage is called the form factor, and is a geometric term. In the second stage the form factors and a description of the light in the scene are used to find an equilibrium. In the third, and last, stage the scene is rendered using Gouraud shading.
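The equilibrium in the second stage amounts to solving B = E + ρFB for the patch radiosities B, given emissions E, reflectances ρ and form factors F. For a toy two-patch scene (all values invented for illustration), simple iteration converges quickly:

```python
# Jacobi-style iteration for the radiosity equilibrium B = E + rho * F B.
# E, rho and F below are invented values for a toy two-patch scene.
def radiosity(E, rho, F, iters=100):
    B = list(E)
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(len(B)))
             for i in range(len(B))]
    return B

E = [1.0, 0.0]              # patch 0 emits light, patch 1 does not
rho = [0.5, 0.8]            # diffuse reflectances
F = [[0.0, 0.2],            # F[i][j]: fraction of i's light reaching j
     [0.2, 0.0]]
B = radiosity(E, rho, F)
print([round(b, 4) for b in B])  # patch 1 is lit purely by interreflection
```

Because the form factors depend only on geometry, this solve can be repeated cheaply for new lighting, which is part of radiosity's appeal for static scenes.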

Another technique reminiscent of raytracing, which has appeared on the computer graphics scene lately, is photon mapping (chapter 3.3) [10]. Photon mapping is a two-stage method, where in the first stage photons are shot into the scene from the light sources. The photons are tracked through the scene, and all their interactions and energy changes are stored in a photon map. In the second stage a raytracer, modified to use the photon map, is used to render the scene. Photon mapping computes the full rendering equation and is faster than radiosity. It is slower than raytracing, though.

Recently, more and more techniques based on projecting the transfer function and the lighting onto orthonormal basis functions have appeared on the computer graphics scene. Most of these are used to reduce rendering time by converting to basis functions that change the integral in the rendering equation to a faster operation, such as a dot product or a matrix-vector multiplication.

The most widely used basis is spherical harmonics, which is linear (chapter 3.4) [16]. Spherical harmonics are used to represent low-frequency lighting, and are very interesting for this thesis, because daylight is by nature fairly low-frequency.

Another basis that is seeing more and more use lately is the wavelet basis. This basis is non-linear and can model light of any frequency. This means it is more complete than spherical harmonics, but also somewhat slower. These methods are beyond the scope of this thesis.

In chapter 3, raytracing, radiosity, photon mapping and spherical harmonic lighting are described in greater detail.


3 Global illumination

3.1 Raytracing

Raytracing was the first global illumination solution. It was first presented in 1980 by Turner Whitted [17] and is based on the idea of tracing rays from the eye into a scene, originally devised by Arthur Appel in 1968.

Raytracing is based on the assumption that the only important light rays are those that can be seen by the observer, the eye (figure 5) [11]. Rays are traced from the imaginary eye point through each pixel of the view into the scene. At the closest intersection with a surface, a shadow ray is shot. If the shadow ray hits another surface before reaching a light source, the intersection point is in shadow.

The nature of the surface determines what happens next. If it is a diffuse surface, the color is calculated directly, but if it is a specular surface we need to apply the raytracing recursively: we shoot a new ray in either the reflected or the transmitted direction, depending on whether the surface is transparent or not.

This process is applied to each pixel in the view, generating a complete image [11].

Figure 5: The raytracing concept

The color of the pixel is determined by a combination of factors at the intersection point between the ray and the closest surface. Direct light and the surface color are the most important, but specularity and the color of the reflected, or transmitted, rays are also included.

The color of a diffuse surface can be calculated directly with the following formula [11]:

C = L S (n · (l − s) / |l − s|)

where
L - light color
S - surface color
n - surface normal
l - light position
s - position on the surface
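As a sketch, the diffuse shading formula above can be written in Python. The vector helpers and the clamp to zero (so that lights behind the surface contribute nothing) are illustrative assumptions, not part of the original formula:

```python
import math

def normalize(v):
    # Scale a 3-vector to unit length.
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def diffuse_color(L, S, n, l, s):
    """Diffuse shading: light color times surface color times the cosine of
    the angle between the normal and the direction toward the light.
    L, S are RGB triples; n is the unit surface normal; l is the light
    position; s is the shaded point on the surface."""
    to_light = normalize(tuple(li - si for li, si in zip(l, s)))
    cos_term = max(dot(n, to_light), 0.0)  # clamp: light behind the surface gives 0
    return tuple(Lc * Sc * cos_term for Lc, Sc in zip(L, S))
```

With the light directly above the point, the cosine term is one and the result is simply the product of the light and surface colors.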

The reflection direction is calculated as follows:


r = d − 2(n · d)n

where d is the light direction.

To calculate the transmission rays we need refraction indices for the participating mediums. The refraction indices determine the angle a ray is deflected when penetrating the surface between the mediums, and the new direction can be computed with the following:

t = (m1/m2) d − ((m1/m2)(n · d) + sqrt(1 − (m1/m2)² (1 − (n · d)²))) n

where m1 and m2 are the refraction indices for the involved mediums.
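The two direction formulas can be sketched together in Python. This is an illustrative reconstruction: the `None` return for total internal reflection (when the square root argument goes negative) is an assumption about how a raytracer would handle that case, not something stated above:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def reflect(d, n):
    # r = d - 2(n . d)n, with d the incoming direction and n the unit normal.
    nd = dot(n, d)
    return tuple(di - 2.0 * nd * ni for di, ni in zip(d, n))

def refract(d, n, m1, m2):
    """Bend unit direction d through a surface with unit normal n, going
    from a medium with index m1 into one with index m2. Returns None on
    total internal reflection."""
    eta = m1 / m2
    cos_i = -dot(n, d)                       # cosine of the incident angle
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None                          # total internal reflection
    return tuple(eta * di + (eta * cos_i - math.sqrt(k)) * ni
                 for di, ni in zip(d, n))
```

When m1 equals m2 the ray passes through undeflected, which is a quick sanity check on the formula.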

This is the original raytracing concept developed by Whitted. However, some people argue that this is backwards raytracing, and that the process of tracing rays from the light sources to the eye is pure raytracing. Since Whitted was first, though, backwards raytracing will here be defined as the process of tracing light rays from the light sources to the eye.

The raytracing concept has some serious drawbacks, however. The first is that it is view dependent, which is quite bad for real-time performance since the view needs to be completely updated every frame. The assumption that the color of a diffuse surface can be calculated directly is also false, since the entire hemisphere around a diffuse surface contributes to its color. Because of this, raytracing cannot model diffuse surfaces correctly.

Raytracing cannot correctly handle color bleeding either, which is based on diffuse interreflection between surfaces close to each other, and shaded regions in the scene will not be illuminated by reflected light.

Many of these drawbacks can be dealt with by adding techniques to the basic raytracing concept. The most important addition to date is Monte-Carlo integration. This method will be described in detail in a later chapter. In short, it can be proved that by using a uniformly distributed random function, the integral over the hemisphere can be reduced to a sum of random values weighted by a weighting function. For raytracing this means that, for a diffuse surface, we can randomly choose a number of rays and send these out into the scene. The resulting color of the diffuse surface will then also be affected by the light coming back from these rays. The drawback with Monte-Carlo integration is of course a performance drop, since more reflection rays must be processed.

3.2 Radiosity

Radiosity is currently the most widely used global illumination technique. Most game developers use radiosity to precalculate lightmaps that are used as textures in a multi-pass system during rendering.

Radiosity is based on radiative heat transfer and because of this takes into account diffuse reflection between surfaces [3]. Diffuse surfaces, as mentioned above, reflect light in all directions with a certain probability; hence every surface receives illumination from all light sources and diffuse surfaces it can "see".

The basic assumption made in radiosity is that all surfaces are diffuse. Because of this the rendering equation can be reduced, and the BRDF can be moved out of the integral over the hemisphere (because it behaves the same way for each incoming direction). The simplified rendering equation can be formulated as [3]:

Lo = Le + fr ∫_Ω Li cosθ dωi   (4)

The radiosity algorithm is divided into three passes. First, the form factors for all surfaces are calculated. The form factor describes the percentage of outgoing light from one surface that is visible to a second surface. For the form factor calculation to be accurate for shadows, it might be necessary to first subdivide the model to get better resolution. Note that for a scene with n surfaces, n² form factors are necessary. In the second pass, the form factors and a description of the light in the scene are used to find an equilibrium, meaning that there are no more changes in illumination. During the third stage the scene is rendered using Gouraud shading.

To find the form factor for two surfaces (figure 6), what is needed is the fractional contribution of the first surface to the second [3]. This is basically a geometric term dependent on the size and orientation of the two surfaces, the distance between them and the visibility between them.

Figure 6: Form factor geometry

The expression relating a finite patch to another finite patch is basically an integral over the two surfaces relating the cosines of the normals:

F_{Ai,Aj} = (1/Ai) ∫_Ai ∫_Aj (cosθi cosθj / (πr²)) dAj dAi   (5)
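The integrand of the form factor integral can be sketched for a pair of differential areas. This is an illustrative Python helper (the name and argument layout are assumptions), and, as in the unoccluded formula above, visibility is ignored:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    return math.sqrt(dot(v, v))

def diff_form_factor(pi, ni, pj, nj, dAj):
    """Kernel of the patch-to-patch form factor integral: the fraction of
    energy leaving a differential area at pi (unit normal ni) that arrives
    at a differential area dAj located at pj (unit normal nj)."""
    r_vec = tuple(b - a for a, b in zip(pi, pj))
    r = norm(r_vec)
    dir_ij = tuple(c / r for c in r_vec)       # unit direction from pi to pj
    cos_i = max(dot(ni, dir_ij), 0.0)
    cos_j = max(-dot(nj, dir_ij), 0.0)         # nj faces back toward pi
    return cos_i * cos_j * dAj / (math.pi * r * r)
```

For two parallel unit-normal patches facing each other at distance one, both cosines are one and the kernel reduces to dAj/π.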

There are several methods for evaluating this integral. The first is called contour integral evaluation [3]. Stokes' theorem can be used to transform this formula into a form more suitable for evaluation, but it is extremely slow and thus not applicable for our purposes. Also note that this formula does not take visibility into account. It is very easy to add visibility to the final algorithm, however.


A faster way of computing the form factors is a method called hemicube evaluation (figure 7). This method was first introduced by Cohen in 1985 [1]. Hemicube evaluation is based on creating a hemisphere surrounding the receiving patch. To calculate the form factor between the receiving patch and a sending patch, the sending patch is first projected onto the hemisphere surrounding the receiving patch, and then the resulting projection is projected onto the base of the hemisphere (the receiving patch).

The first projection accounts for the cosine of the angle between the projected normal and the hemisphere and also a 1/r term. The projection from the hemisphere onto its base accounts for the other cosine term and another 1/r term.

The area projected onto the base of the hemisphere is therefore the form factor for the two patches.

To speed this technique up somewhat, the hemisphere can be replaced with a cube with the receiving patch at the center. The surface of the cube is divided into a set of pixels. Each pixel's contribution to the form factor can then be precalculated. What remains is to project patches onto the hemicube and determine which pixels to use.

Figure 7: Hemicube evaluation

There are two primary problems with using hemicube evaluation. Since only the integral over the sending patch is computed, the resulting form factor can be inaccurate for patches whose size is large compared to the distance between the patches. Aliasing can also occur in the form factor because regularly spaced solid angles are used. Small patches can in some cases disappear altogether from the calculation.

Solving the second pass of the radiosity algorithm can be done in two ways. Either the equation system can be solved directly; this is not very fast, though, because the equation system is huge. For a scene with n patches the equation system has n² terms. Solving this will take a long time for any kind of equation solver.

The second method for computing the second pass is to solve the equation system iteratively. This method is called progressive radiosity. The equation system can be solved iteratively in two ways. The first method is to converge to the solution by solving one row at a time. This is commonly known as a light gathering solution. The solution can also be found by distributing light to its environment from a single patch and repeating this for all patches that emit light. This is known as a light scattering solution. If we use an estimate for the remaining unscattered light in the scene as the ambient term, we can even render the scene while we are solving the system. By constantly keeping a list of patches sorted by the remaining unshot light in them and always picking the patch with the most remaining light, the system will converge faster.
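The light scattering (shooting) iteration can be sketched in Python. This is a simplified illustration that assumes equal patch areas, so the form factor matrix is symmetric and the per-patch area ratio drops out; a real solver would carry the areas through:

```python
def progressive_radiosity(E, rho, F, iterations=100):
    """Shooting solution of B_i = E_i + rho_i * sum_j F_ij * B_j.
    E: emitted radiosity per patch, rho: diffuse reflectance per patch,
    F[i][j]: form factor from patch i to patch j (equal areas assumed)."""
    n = len(E)
    B = list(E)          # accumulated radiosity
    unshot = list(E)     # radiosity not yet distributed
    for _ in range(iterations):
        # always shoot from the patch with the most unshot light,
        # which makes the solution converge faster
        i = max(range(n), key=lambda k: unshot[k])
        if unshot[i] <= 1e-9:
            break
        for j in range(n):
            if j == i:
                continue
            dB = rho[j] * F[i][j] * unshot[i]
            B[j] += dB
            unshot[j] += dB
        unshot[i] = 0.0
    return B
```

For two patches that each see half of the other's hemisphere (F = 0.5 both ways), with one emitter and 50% reflectance everywhere, the iteration converges to the analytic fixed point B0 = 16/15, B1 = 4/15.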

The advantage radiosity has over raytracing is that it can model indirect light and color bleeding. Radiosity is also view-independent, which is a big advantage compared to raytracing: the illumination can be precalculated.

The biggest drawback of radiosity is that it is expensive to calculate; it is much more expensive than raytracing, for instance. Another big drawback is the assumption radiosity is based on, namely that all surfaces are diffuse. This is not the case in reality. Because of that assumption, no reflective or transparent surfaces can be modeled using radiosity. A problem also exists when using non-planar objects such as spheres. The form factor for a sphere is very difficult to calculate; the only solution is to discretize the object into patches before calculating the form factors, which results in an even bigger performance drop.

3.3 Photon mapping

Photon mapping is a fairly new technique for global illumination. It was first described by Henrik Wann Jensen in his Ph.D. dissertation in 1996 [10]. Photon mapping is a technique that combines diffuse and specular reflections at the speed of raytracing. Photon mapping is interesting because it captures the entire rendering equation.

The basis of photon mapping is very simple to understand. Photons are shot into a scene, and the energy left behind and the photons themselves are tracked and recorded.

Photon mapping is a two-pass algorithm [10]. In the first pass, photons are shot from the light sources and tracked through space. When a surface is encountered, the energy contribution the photon gives to that surface is recorded and the photon is reflected. If the photon hits a diffuse surface, it is stored in a global photon map. In the second pass, ordinary raytracing is used in conjunction with the photon map to calculate the complete illumination of the scene.

The first pass starts with choosing a light source at random. The probability of choosing a specific light source is proportional to the power of the light it emits in comparison with the total amount of light in the scene. The next step is to choose an origin and a directional vector for the photon. Again, this is chosen at random, proportionally to the size of the light source. The photon is then shot into the scene. Once the closest intersection with a surface is found, it is time to update the global photon map. What happens next is determined by the nature of the intersected surface. If the surface is specular, the photon map is not updated. If the surface is diffuse, the energy and incident direction are stored in the photon map. If the surface is a combination of diffuse and specular, one of the two scenarios is chosen at random, proportionally to the amount of specularity of the surface.


The BRDF of the surface is then used to choose whether the photon is reflected or absorbed. If the photon is reflected, a reflection ray is calculated and the process is repeated for the new direction. The direction of the reflection ray is determined by the nature of the surface. If the surface is specular, perfect reflection occurs and the photon is reflected according to the law of reflection. If the surface is diffuse, a direction is chosen at random, proportionally to the cosine between the perfect reflection direction and the actual reflection direction.

This process is repeated for each photon. Once a sufficient number of photons have been shot, they are sorted and stored. The suggestion given by Jensen [10] is to use a k-d tree data structure to store the photons.

The second pass creates an image much like raytracing does. A ray is shot from the eye into the scene. If the first intersected surface is specular, a single reflection ray is sent out. If the surface is diffuse, several reflected rays are sent out. For each reflection ray that hits another surface, a radiance estimate is calculated. The estimate finds the m closest photons in the global photon map within some distance. This is the reason for using a k-d tree: the search procedure for finding the closest photons is very fast compared to many other data structures. The energies of these photons are then summed and divided by the area in which they were found. For the final illumination of the intersection point, the estimates are averaged and combined with direct light.
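The radiance estimate can be sketched in Python. This illustrative version replaces the k-d tree with a brute-force nearest-neighbour search, so it shows the estimate itself rather than the fast lookup; function and field names are assumptions:

```python
import math

def radiance_estimate(photons, x, k):
    """Estimate reflected radiance at point x from the k nearest photons.
    Each photon is a (position, power) pair. A real implementation would
    query a k-d tree; this sketch sorts all photons by squared distance."""
    by_dist = sorted(photons,
                     key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    nearest = by_dist[:k]
    # radius of the disc that just encloses the k photons found
    r2 = max(sum((a - b) ** 2 for a, b in zip(nearest[-1][0], x)), 1e-12)
    total_power = sum(p[1] for p in nearest)
    # divide the accumulated power by the disc area the photons were found in
    return total_power / (math.pi * r2)
```

Four photons of power 0.25 found at distance one from the query point, for example, give one unit of power over a disc of area π.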

To model color bleeding, the color of a photon should be changed on reflection (or transmission), and stored in the photon map along with the energy and position.

This technique was used as an offline technique up until 2002, when Timothy Jozwowski described a way to use photon mapping in real time in his master's thesis [11]. His suggestion is to reshoot a part of the photons every frame, and update the photon map accordingly. Ordinary raytracing speedup techniques can also be used for increased performance.

Photon mapping's biggest drawback is still its speed, despite Jozwowski's [11] investigation of the technique's real-time performance. It is also very hard to model large area light sources (such as analytical atmosphere models). The questions that arise when trying to model such a light source are how to choose the number of photons that should be shot from the light source, and the origin of each photon on the light source.

There are also a few issues which must be dealt with: basically, how many photons in total should be shot, how many photons should be reshot every frame, and the distribution of these photons.

3.4 Spherical harmonic lighting

Spherical harmonic lighting is a new, emerging technique, first introduced by Kautz, Sloan and Snyder [16]. It is a technique used primarily to model low-frequency light for real-time rendering.

Low-frequency light can be described as light coming from large area light sources with fairly stable luminance [6]. This means that the method is very well suited for daylight situations, because daylight is very low-frequency (the light source is basically the entire hemisphere around us).

Spherical harmonic lighting [16, 12] is based on projecting the transfer function (basically the BRDF multiplied with the cosine of the incoming angle) onto the orthonormal spherical harmonics basis functions. This yields a coefficient vector of length n², where n is the number of bands of functions in the Legendre family of polynomials that is used. The associated Legendre polynomials are defined only in one dimension; spherical harmonics use these functions to define basis functions in two dimensions. More about this in chapter 4.

Spherical harmonic lighting is mostly used for diffuse surfaces, which can be modeled with fairly few coefficients (on the order of 9-25). Glossy surfaces can also be modeled to some extent by using a matrix of coefficients instead of a vector. Highly specular surfaces, however, cannot be modeled.

In the article published by Kautz, Sloan and Snyder [16], the transfer function used for a diffuse surface is simply the cosine term of the incoming angle. This function multiplied with the spherical harmonics basis functions is numerically integrated to yield coefficients which describe an object's shaded response to its environment.

If shadows are required, a simple visibility term is added to the transfer function. The easiest way of including this is with a simple raytracer that returns whether the ray hits something or not.

The approach taken by the authors in [16] is to precalculate the transfer function coefficients for the geometry. The coefficients are then stored in a convenient way, to be used in the rendering stage.

During the rendering stage, the light function is projected onto the spherical harmonics basis and a dot product is performed between the light function and the vertex transfer function for all vertices. This yields the brightness at the specific vertex. The reason that only a dot product is needed is that, with orthonormal basis functions, the dot product in the new basis is equal to the integral in the old basis.

Spherical harmonics can also model interreflections, by iteratively determining indirect light and adding this to the transfer function coefficients.

The biggest problem with using spherical harmonics is that it is very expensive to compute the transfer function coefficients. This means that dynamic geometries are a big problem, because if a geometry is translated or rotated, all transfer functions need to be recalculated. The method is also not very well suited for point lights, since these usually exhibit high-frequency lighting situations, which spherical harmonics cannot model. Specular and transparent surfaces are also a big problem to model with the approach taken by Kautz, Sloan and Snyder.

The last problem is that the transfer functions are computed for each vertex. This means that for models with few polygons, shadows will have very low resolution, which can result in odd situations. A better approach would be to calculate the transfer functions as textures and use a pixel shader to determine the brightness at each point. This is beyond the scope of this thesis, however.


4 Spherical harmonic lighting - in depth

4.1 Legendre polynomials - the basis

Spherical harmonics [16, 12] are based on a family of orthogonal basis functions. Basis functions are small pieces of signals that can be scaled and combined to create an approximation to an original function.

To approximate a function, a scalar value is calculated for each basis function Bi(x) that represents how much the original function corresponds to that basis function. This process is called projection and involves transforming the functions into the space defined by these basis functions. Projection consists of integrating the product of the function we wish to project and the basis function in question (∫ f(x) Bi(x) dx). This results in a vector of coefficients that can be used to recreate the original function.

To project the function back from the space defined by the basis functions to its original space, the basis functions are simply scaled by these coefficients and summed. This results in an approximation of the old f(x).

Orthogonal basis functions [6] are very important because of one of their fundamental properties: integrating the product of any two of them returns either a constant value, if they are the same function, or zero, if they are not:

∫_{-1}^{1} Fm(x) Fn(x) dx = 0 if n ≠ m, α if n = m

If this integral always returns either zero or one, the basis functions are said to be orthonormal. In simpler terms, the functions do not overlap while still being in the same space.
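The orthogonality property can be checked numerically. As a sketch, the snippet below evaluates ordinary Legendre polynomials with the Bonnet recurrence (a standard relation, used here only for illustration) and approximates the inner-product integral over [-1, 1] with the midpoint rule:

```python
def legendre(n, x):
    """Ordinary Legendre polynomial P_n(x) via the Bonnet recurrence:
    k*P_k = (2k-1)*x*P_{k-1} - (k-1)*P_{k-2}."""
    p_prev, p = 1.0, x
    if n == 0:
        return p_prev
    for k in range(2, n + 1):
        p_prev, p = p, ((2 * k - 1) * x * p - (k - 1) * p_prev) / k
    return p

def inner(m, n, steps=20000):
    """Midpoint-rule approximation of the integral of P_m * P_n over [-1, 1]."""
    h = 2.0 / steps
    return sum(legendre(m, -1.0 + (i + 0.5) * h) *
               legendre(n, -1.0 + (i + 0.5) * h) * h for i in range(steps))
```

Different indices give (numerically) zero, while equal indices give the nonzero constant 2/(2n+1), so these polynomials are orthogonal but not orthonormal.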

The family of such basis functions that spherical harmonics are based on is called the associated Legendre polynomials [6, 16, 12]. Ordinary Legendre polynomials return complex numbers, but in the application of spherical harmonics in computer graphics only the real part is interesting; hence the associated polynomials are used.

The polynomials are traditionally represented by the letter P, are defined over the range [-1,1] and take two arguments, often denoted l and m. These break the family into bands of functions, where l is the band index and is defined to always be greater than or equal to zero. The argument m is the function index inside a band and is defined over the range [0,l].

The functions can be visualized as a triangular grid of n(n+1)/2 coefficients for an n-band approximation:

P_0^0
P_1^0  P_1^1
P_2^0  P_2^1  P_2^2

Evaluation of Legendre polynomials is very hard [6], and because of this they are rarely used in approximating functions. The definition of the polynomials is given in terms of derivatives of imaginary numbers. To minimize computational complexity, the definition is rewritten in terms of recurrence relations.

This results in three rules [6]:

1. A higher band can be generated from two lower ones with the following formula:
   (l − m) P_l^m(x) = x (2l − 1) P_{l−1}^m − (l + m − 1) P_{l−2}^m

2. The second rule is the starting rule, because it needs no previous values. It is defined as
   P_m^m = (−1)^m (2m − 1)!! (1 − x²)^{m/2}
   where !! is the double factorial. Since 2m − 1 is always odd, the double factorial reduces to the product of all odd numbers up to 2m − 1.

3. The third rule is used to lift a term to a higher band:
   P_{m+1}^m = x (2m + 1) P_m^m

The algorithm is to use rule two to get to the highest possible P_m^m. If l = m, the result is found by only using this rule. After this, rule three is used once. If l = m + 1, the result has been found. Last, rule one is used until the correct answer has been found. Rule one is used because it causes fewer floating point errors than rule three does.

The algorithm for evaluating the associated Legendre polynomials can be defined as follows [6]:

algorithm P(l, m, x)
    Pmm = 1
    if m > 0
        somx2 = sqrt((1 − x)(1 + x))
        fact = 1
        for i = 1..m
            Pmm = Pmm * (−fact) * somx2
            fact = fact + 2
    if l == m
        return Pmm
    Pmm1 = x * (2m + 1) * Pmm
    if l == m + 1
        return Pmm1
    Pll = 0
    for ll = m+2 .. l
        Pll = (x * (2*ll − 1) * Pmm1 − (ll + m − 1) * Pmm) / (ll − m)
        Pmm = Pmm1
        Pmm1 = Pll
    return Pll


4.2 Spherical harmonics

The Legendre polynomials are all good for one dimension, but what about the two-dimensional surface of a sphere? The answer is spherical harmonics, which are built upon the Legendre polynomials. Spherical harmonics is a mathematical system analogous to the Fourier transform, but defined on the surface of a sphere instead [6].

Spherical harmonics in general are defined on complex numbers but, as mentioned above, only the real part is used for applications in computer graphics. The real part of spherical harmonics is called real spherical harmonics.

The system uses the same two parameters as the Legendre polynomials, namely l and m. In addition to these, spherical harmonics use spherical coordinates (θ, φ) as parameters.

As mentioned above, the conversion between Cartesian and spherical coordinates is defined as:

x = sinθ cosφ
y = sinθ sinφ
z = cosθ

The spherical harmonics basis functions are defined as follows [6, 12, 16]:

Y_l^m(θ, φ) = √2 K_l^m cos(mφ) P_l^m(cosθ)          m > 0
Y_l^m(θ, φ) = √2 K_l^{−m} sin(−mφ) P_l^{−m}(cosθ)   m < 0
Y_l^0(θ, φ) = K_l^0 P_l^0(cosθ)                     m = 0

where P is the associated Legendre polynomials and K is a weighting function [6, 12, 16] that uses the same parameters l and m. K is defined as follows:

K_l^m = sqrt( ((2l + 1) (l − |m|)!) / (4π (l + |m|)!) )

The parameters l and m in spherical harmonics are the same as in the associated Legendre polynomials, with the difference that m is defined on another domain, [−l, l].

Flattening the table of coefficients to a single vector is very useful for our purposes:

Y_l^m(θ, φ) = Y_i(θ, φ), where i = l(l + 1) + m

The algorithm for K is [6]:

algorithm K(l, m)
    num = (2l + 1) * factorial(l − |m|)
    denom = 4π * factorial(l + |m|)
    return sqrt(num / denom)

The factorial function is best implemented as a table of precomputed values. There will never be a need for a bigger value than two times the number of bands we wish to use.

The evaluation algorithm for the spherical harmonics basis functions is defined as [6]:


algorithm SH(l, m, θ, φ)
    if m == 0 return K(l, 0) * P(l, 0, cos(θ))
    if m > 0 return sqrt(2) * K(l, m) * cos(mφ) * P(l, m, cos(θ))
    return sqrt(2) * K(l, −m) * sin(−mφ) * P(l, −m, cos(θ))
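The three routines P, K and SH above can be sketched in runnable form. This Python version follows the pseudocode closely; `math.factorial` stands in for the precomputed factorial table suggested earlier:

```python
import math

def P(l, m, x):
    # Associated Legendre polynomial via the three recurrence rules.
    pmm = 1.0
    if m > 0:                               # rule two: the starting rule
        somx2 = math.sqrt((1.0 - x) * (1.0 + x))
        fact = 1.0
        for _ in range(m):
            pmm *= -fact * somx2
            fact += 2.0
    if l == m:
        return pmm
    pmmp1 = x * (2.0 * m + 1.0) * pmm       # rule three: lift to band m+1
    if l == m + 1:
        return pmmp1
    pll = 0.0
    for ll in range(m + 2, l + 1):          # rule one: climb remaining bands
        pll = ((2.0 * ll - 1.0) * x * pmmp1 - (ll + m - 1.0) * pmm) / (ll - m)
        pmm, pmmp1 = pmmp1, pll
    return pll

def K(l, m):
    # Normalisation constant for the spherical harmonics basis.
    num = (2.0 * l + 1.0) * math.factorial(l - abs(m))
    den = 4.0 * math.pi * math.factorial(l + abs(m))
    return math.sqrt(num / den)

def SH(l, m, theta, phi):
    # Real spherical harmonics basis function Y_l^m.
    if m == 0:
        return K(l, 0) * P(l, 0, math.cos(theta))
    if m > 0:
        return math.sqrt(2.0) * K(l, m) * math.cos(m * phi) * P(l, m, math.cos(theta))
    return math.sqrt(2.0) * K(l, -m) * math.sin(-m * phi) * P(l, -m, math.cos(theta))
```

As a sanity check, Y_0^0 is the constant 1/(2√π) ≈ 0.2821 everywhere on the sphere, and Y_1^0 at the pole equals √(3/(4π)) ≈ 0.4886.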

Plotted as spherical functions, by distance from origin and by color on the unit sphere, the first five bands of the spherical harmonics basis functions can be seen in figure 8 [6].

Figure 8: Spherical harmonics basis functions

4.3 Projecting to spherical harmonics

As mentioned before, projection is the process of working out how much of each spherical harmonics basis function the original function is like [6]. The projection of any spherical function into spherical harmonics coefficients is easy once the basis functions are defined.

To project a function f(ω) to spherical harmonics, the following integral should be calculated [12, 16, 6]:

c_l^m = ∫_Ω f(ω) Y_l^m(ω) dω

And to reconstruct an approximation of the old function, the following sum is used:

f̃(ω) = ∑_{l=0}^{n−1} ∑_{m=−l}^{l} c_l^m Y_l^m(ω)

If the function is defined in spherical coordinates, which makes the most sense for a spherical function, the integration over the sphere for a function f is:


∫_0^2π ∫_0^π f(θ, φ) sinθ dθ dφ

To project this function into spherical harmonics coefficients, the product of the function f and the spherical harmonics basis function has to be integrated over the sphere [6]:

c_l^m = ∫_0^2π ∫_0^π f(θ, φ) Y_l^m(θ, φ) sinθ dθ dφ

This integral is extremely expensive to evaluate symbolically. What is needed is a fast way to numerically integrate the function. The answer is a technique called Monte-Carlo integration [6].

Monte-Carlo integration is based on probability theory. Because of this, a few definitions from statistics are necessary.

A random variable is a value within a specific domain with a specific random distribution over that domain. The cumulative distribution function P(x) is defined as the probability of getting a value that is less than x. If the probability of getting any value is equal, the random variable is called a uniform random variable. If the variable returns values in the range [0, 1], it is called a canonical uniform random variable and is denoted by ξ.

The interesting part of probability theory is the probability density function. It is defined as the relative probability that a random variable will take on a specific value, and can be calculated by taking the derivative of the cumulative distribution function. The probability density function integrates to one and is greater than or equal to zero for all values:

∫_{−∞}^{∞} p(x) dx = 1, where p(x) ≥ 0

The expected value of a function of a random variable can be estimated by taking the mean of a large number of samples of the function. This estimate converges to the correct answer as the number of samples increases towards infinity.

This results in the Law of Large Numbers:

E[f(x)] ≈ (1/N) ∑_{i=1}^{N} f(x_i)

If this is combined with the definition of the expected value, a way to numerically integrate a function can be defined:

∫ f(x) dx = ∫ (f(x) / p(x)) p(x) dx ≈ (1/N) ∑_{i=1}^{N} f(x_i) / p(x_i)

If the random samples are uniformly distributed, the function can be point sampled, summed and then divided by the number of samples and the probability function p(x).
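The estimator above can be sketched for a one-dimensional integral. This is an illustrative Python helper (names and the fixed seed are assumptions); for a uniform sample on [a, b] the pdf is p(x) = 1/(b − a), so the estimate reduces to (b − a)/N times the sum of the point samples:

```python
import random

def monte_carlo(f, a, b, n=100000, seed=1):
    """Monte-Carlo estimate of the integral of f over [a, b] using uniformly
    distributed samples. Dividing by p(x) = 1/(b - a) is the same as
    multiplying the mean of the samples by (b - a)."""
    rng = random.Random(seed)   # fixed seed for a repeatable sketch
    total = 0.0
    for _ in range(n):
        total += f(a + (b - a) * rng.random())
    return (b - a) * total / n
```

Estimating ∫_0^1 x² dx this way lands close to the exact value 1/3, with an error that shrinks as 1/√N.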

Since spherical harmonics use spherical coordinates, a way to generate uniformly distributed samples over the sphere is needed. This can be done with two canonical values ξx and ξy in the following manner:

(θ, φ) = (2 arccos(√(1 − ξx)), 2π ξy)

The probability function for these random variables is p(x) = 1/(4π). Because the probability function is constant, it can be broken out of the integral.

Hence, the integral over the sphere of the product of a function f and the spherical harmonics basis functions is:

c_l^m = ∫_0^2π ∫_0^π f(θ, φ) Y_l^m(θ, φ) sinθ dθ dφ ≈ (1/N) ∑_{j=1}^{N} f(θ_j, φ_j) Y_l^m(θ_j, φ_j) w(θ_j, φ_j)

where the weighting function is w(θ_j, φ_j) = 1/p(x) = 4π.

Note that the term Y_l^m(θ_j, φ_j) will be the same no matter what function is being projected. This means that the spherical samples can be precalculated along with the spherical harmonics coefficients into a simple sample array.

The algorithm for projecting a spherical function fn into spherical harmonics coefficients can be defined as follows [6]:

for i = 0..nsamples-1
    v = fn(samples[i].theta, samples[i].phi)
    for n = 0..ncoeff-1
        result[n] = result[n] + v * samples[i].coeff[n]
for i = 0..ncoeff-1
    result[i] = result[i] * 4π / nsamples
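A minimal runnable sketch of this projection loop, using the uniform sphere sampling derived above. For brevity only the first two zonal basis functions, y_0^0 and y_1^0, are hard-coded here instead of going through a general SH routine; names and the fixed seed are illustrative assumptions:

```python
import math
import random

def uniform_sphere_samples(n, seed=1):
    """Uniform directions on the sphere:
    (theta, phi) = (2*acos(sqrt(1 - xi_x)), 2*pi*xi_y)."""
    rng = random.Random(seed)
    return [(2.0 * math.acos(math.sqrt(1.0 - rng.random())),
             2.0 * math.pi * rng.random()) for _ in range(n)]

# First two zonal basis functions, hard-coded for brevity:
# y_0^0 = sqrt(1/(4*pi)) and y_1^0 = sqrt(3/(4*pi)) * cos(theta).
def y00(theta, phi):
    return math.sqrt(1.0 / (4.0 * math.pi))

def y10(theta, phi):
    return math.sqrt(3.0 / (4.0 * math.pi)) * math.cos(theta)

def project(f, basis, samples):
    """Monte-Carlo projection: c = 4*pi/N * sum over samples of f * y."""
    weight = 4.0 * math.pi / len(samples)
    return [weight * sum(f(t, p) * y(t, p) for t, p in samples)
            for y in basis]
```

Projecting the constant function f = 1 gives c_0 = 2√π exactly (the estimator has no variance for a constant), while the higher coefficient comes out near zero up to Monte-Carlo noise.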

Reconstruction of the original function is now simply the spherical harmonics basis functions weighted by the coefficients calculated above.

4.4 Lighting diffuse surfaces with spherical harmonics

To calculate the color of a point on a surface, the integral of the product of an illumination (lighting) function and a description of the surface reflectance, the BRDF or transfer function, is calculated:

∫_Ω L(ω) t(ω) dω

where L is the lighting function and t the transfer function.

If both the lighting function and the transfer function are projected onto the spherical harmonics basis, the orthogonality of the basis functions reduces the integral to a dot product between the coefficient vectors:

∫_Ω L(ω) t(ω) dω = ∑_{i=0}^{n²−1} L_i t_i

All lighting is therefore reduced to a dot product per light source [12, 16, 6]. This has the same computational complexity as direct illumination using the Phong model, with the addition of global illumination effects. The drawback is that the transfer function needs to be projected into spherical harmonics, which is an expensive procedure.
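The rendering-time work per vertex then amounts to the dot product itself. A trivial sketch (the function name is an illustrative assumption):

```python
def shade_vertex(light_coeffs, transfer_coeffs):
    """Per-vertex lighting: the integral of L(w) * t(w) over the sphere
    collapses to a dot product of the two coefficient vectors."""
    return sum(L * t for L, t in zip(light_coeffs, transfer_coeffs))
```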


For each vertex, a transfer function [12, 16] needs to be calculated and projected to spherical harmonics. There are three different types of transfer functions defined for diffuse surfaces: unshadowed transfer, shadowed transfer and interreflected transfer.

4.4.1 Diffuse unshadowed transfer

To define a transfer function for an unshadowed diffuse surface, the rendering equation is simplified to the bare minimum [6]:

\[ L(x, \omega_o) = \int_\Omega f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, H(x, \omega_i)\, d\omega_i \]

Since diffuse surfaces reflect light equally in all directions, the BRDF can be pulled out of the integral, and the resulting equation for calculating the color becomes:

\[ L_{DU}(x, \omega_o) = \frac{\rho_x}{\pi} \int_\Omega L_i(x, \omega_i)\, \max(N \cdot \omega_i, 0)\, d\omega_i \]

where \rho_x is the surface color at the point x and N is the surface normal at the same point.

Separating the lighting function from this equation results in the diffuse unshadowed transfer function, which is a purely geometric term [12, 16]:

\[ M_{DU} = \max(N \cdot S_j, 0) \]

where S_j are our precalculated samples [6].

algorithm diffuse unshadowed:

    for i = 0 .. nsamples-1
        h = dot( samples[i].cartesian, normal )
        if h > 0
            for j = 0 .. ncoeff-1
                result[j] = result[j] + h * samples[i].coeff[j]
    for i = 0 .. ncoeff-1
        result[i] = result[i] * 4*pi / nsamples

4.4.2 Diffuse shadowed transfer

By adding a visibility term to the equation, shadowing is achieved. The new equation becomes [6]:

\[ L_{DS}(x, \omega_o) = \frac{\rho_x}{\pi} \int_\Omega L_i(x, \omega_i)\, \max(N \cdot \omega_i, 0)\, V(\omega_i)\, d\omega_i \]

In the same sense as above, if the lighting function is pulled out of the equation, the result is the shadowed transfer function, which is the geometric term multiplied with the visibility term [12, 16]:

\[ M_{DS} = V(S_j)\, \max(N \cdot S_j, 0) \]


The visibility term is the main difference between the traditional computer graphics illumination model (direct illumination) and global illumination solutions. It is usually called occluded ambient [6].

All that must be added to the algorithm for the unshadowed transfer function is an intersection test with the world. If this returns zero and the dot product is greater than zero, the dot product multiplied with the sample coefficients is added to the result.
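A hedged sketch of the shadowed projection, with an occluded callback standing in for the scene-graph ray test (in the implementation this is done by the HitVisitor described in chapter 5; all names here are illustrative):

```cpp
#include <cmath>
#include <cstddef>
#include <functional>
#include <vector>

// Illustrative sample type: unit direction in Cartesian coordinates plus
// the precalculated SH basis values for that direction.
struct DirSample {
    double dir[3];
    std::vector<double> coeff;
};

// Diffuse shadowed transfer for one vertex. `occluded` returns true if a
// ray from the vertex along dir hits geometry, i.e. V(omega) = 0.
std::vector<double> shadowedTransfer(
    const std::vector<DirSample>& samples,
    const double normal[3],
    const std::function<bool( const double* )>& occluded,
    std::size_t ncoeff )
{
    std::vector<double> result( ncoeff, 0.0 );
    for ( const DirSample& s : samples ) {
        const double h = s.dir[0] * normal[0] + s.dir[1] * normal[1]
                       + s.dir[2] * normal[2];
        if ( h > 0.0 && !occluded( s.dir ) )   // V(omega) * max(N . omega, 0)
            for ( std::size_t j = 0; j < ncoeff; ++j )
                result[j] += h * s.coeff[j];
    }
    const double weight = 4.0 * M_PI / samples.size();
    for ( double& r : result )
        r *= weight;
    return result;
}
```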

4.4.3 Diffuse interreflected transfer

The most interesting part of the Rendering Equation is where it recursively adds in indirect light. The diffuse interreflected lighting equation can be defined as follows:

\[ L_{DI}(x) = L_{DS}(x) + \frac{\rho_x}{\pi} \int_\Omega L_i(x, \omega_i)\, (1 - V(\omega_i))\, \max(N \cdot \omega_i, 0)\, d\omega_i \]

The transfer function for interreflected transfer is fairly complicated and not really interesting, but the algorithm for generating it is fairly simple to understand. It includes six basic steps [6]:

1. Calculate the direct transfer functions for shadowed diffuse.
2. Fire rays from the current point. If a surface is hit, linearly interpolate the spherical harmonics coefficients for that surface using the coordinates of the hit point.
3. Multiply the result with the dot product between the ray and the surface normal at the origin of the ray, and sum the result into a new spherical harmonics vector.
4. After all rays have been cast, divide the new vector with the number of samples and the weighting term.
5. If more than one stage of interreflection is wanted, repeat the whole process with the new vectors.
6. Add the direct transfer and the resulting vectors to form the new spherical harmonics vector for a vertex.

This is usually called a probabilistic gathering solution, where each point searches blindly for neighbors and pulls in light from the neighbors it finds. It is quite possible to implement a non-probabilistic version, with precalculation of which vertices are visible to each other. A scattering solution is also possible. A small performance loss can be expected with a non-probabilistic solution, since the spherical harmonics coordinates need to be calculated explicitly for each vertex pair, but since the most expensive part of the algorithm is the ray-model intersection test, the cost is relatively low.
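The gathering steps can be sketched as follows; Ray, Hit and the trace callback are illustrative stand-ins for the raytracer classes of chapter 5, not their actual interfaces:

```cpp
#include <cmath>
#include <cstddef>
#include <functional>
#include <vector>

struct Ray { double origin[3]; double dir[3]; };

struct Hit {
    bool hit;
    std::vector<double> shCoeff; // interpolated SH vector at the hit point
};

// One gathering bounce (steps 2-4 above) for a single vertex.
std::vector<double> gatherBounce(
    const double origin[3], const double normal[3],
    const std::vector<std::vector<double>>& sampleDirs,
    const std::function<Hit( const Ray& )>& trace,
    std::size_t ncoeff )
{
    std::vector<double> bounce( ncoeff, 0.0 );
    for ( const std::vector<double>& d : sampleDirs ) {
        const double h = d[0] * normal[0] + d[1] * normal[1]
                       + d[2] * normal[2];
        if ( h <= 0.0 )
            continue;                              // below the surface
        const Ray r = { { origin[0], origin[1], origin[2] },
                        { d[0], d[1], d[2] } };
        const Hit hit = trace( r );                // step 2
        if ( !hit.hit )
            continue;
        for ( std::size_t j = 0; j < ncoeff; ++j )
            bounce[j] += h * hit.shCoeff[j];       // step 3
    }
    const double weight = 4.0 * M_PI / sampleDirs.size(); // step 4
    for ( double& b : bounce )
        b *= weight;
    return bounce; // added to the direct transfer in step 6
}
```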

4.5 Rendering spherical harmonics lighting

To calculate the resulting color of a vertex, calculate the dot product of the coefficient vectors for the transfer function and the lighting function, and use the resulting value as a scaling factor for the diffuse color of the vertex.

This operation is well suited for hardware accelerated rendering, using a vertex processor architecture on the graphics board to calculate the dot products.

If hard shadows aren't necessary, the original transfer function is well approximated by the first two bands of spherical harmonics basis functions. This means that the coefficient vectors will have four values. The dot product of such vectors is usually a one-stage operation in modern graphics hardware [6].

The resulting color calculation replaces ambient and diffuse lighting and adds shadows tothe direct illumination model.

Of course, spherical harmonics lighting can be mixed with other types of lighting in a multi-pass system. The most useful notion is to add a specular term alongside the spherical harmonics lighting to get a closer approximation of the Rendering Equation for all kinds of materials.


5 Implementation

5.1 OpenSceneGraph - the framework

The goal of this thesis is to build a rendering and lighting system on top of OpenSceneGraph, which is an open source scene graph. OpenSceneGraph [14] is a portable high-level graphics framework built on top of the low-level graphics library OpenGL.

OpenSceneGraph is built to provide a high-performance, portable and scalable graphics framework [13]. To increase performance, OSG supports numerous culling methods, state sorting, low-level graphics hardware acceleration techniques and display lists. This makes OSG one of the fastest general purpose scene graphs on the market.

Because OpenSceneGraph mostly depends on Standard C++ and OpenGL and nothing else, it is very portable. The core scene graph is completely windowing system independent, but most windowing systems have wrappers that are easily downloaded from the OpenSceneGraph homepage.

Not only does OSG have support for numerous hardware acceleration techniques, it also has support for multiple rendering pipelines. This is because the core scene graph has support for multiple graphics contexts.

There are basically three different ways to add, or change, behavior in a scene graph. The first is the use of the design pattern Visitor to traverse the scene graph and operate on it. The second is the use of localized callbacks for the three basic stages in each frame: update, cull and render. The third is to use different OpenGL rendering states for different nodes. Since OSG supports state sorting, this won't really result in any significant performance drops.

5.2 System design

The system is basically divided into four different parts: spherical harmonics evaluation, a basic raytracer, calculation of the transfer functions and rendering of the scene.

The first part, the spherical harmonics evaluation code, is divided into two classes, Vector and Sample. The Vector wraps a spherical harmonics coefficient vector, which is basically a copy of an ordinary three-dimensional floating-point vector with the difference that the number of elements in the float array is determined at runtime. A static variable in the Vector class determines the number of bands of spherical harmonics functions to use, and hence also the number of coefficients.

The Sample class encapsulates everything concerning a spherical sample, along with its SH coefficients and its spherical coordinates. All code to evaluate a polar function and a transfer function sample is encapsulated in this class. The Sample class also contains static methods to generate a new set of samples and to store them.

The second part is a simple raytracer, which is used to check if a ray intersects with a subgraph at some point. The coordinates of the point are not necessary, only whether there was a hit. The most important class here is the HitVisitor, a visitor used on a scene graph that traverses the graph and determines if a specified ray hits any polygon in the scene. HitVisitor::Ray is a simple encapsulation of a ray, and contains all the intersection testing between a ray and bounding volumes. The Ray class uses a HitFunctor and the subclass PolyIntersect to check for intersections between the ray and a polygon.

The third part is the transfer function calculation code, and consists of a single class, CalcVisitor. This visitor is used to traverse the graph, calculate the transfer function coefficients for each vertex and save these to file.

The fourth part is the rendering of a scene using spherical harmonic lighting. A simple vertex program is used to calculate the dot product between the coefficient vectors. Light objects are defined by deriving from the base class LightNode, which encapsulates all that is needed to update its lighting parameters. If several light sources are needed, however, some tweaking will be required, but that is beyond the scope of this thesis.

The light function coefficients are sent to the vertex program as a local parameter. To set up the lighting a visitor is used, SetupVisitor, which creates a vertex program and traverses the graph, adding the vertex program to all geometries as a rendering state. The transfer function coefficients are sent to the vertex program as vertex attributes.

The dependencies in the system can be viewed in the following graph:

Figure 9: Class dependencies


5.3 System usage

To calculate the transfer function coefficients for a model, all that is needed is to initialize the spherical sample list and then call the CalcVisitor on the graph:

Example 1: Calculation of transfer coefficients

    SH::Sample::Init();
    SH::CalcVisitor shc( rootnode, modelFileName, shadow );
    rootnode->accept( shc );

To set up rendering to use spherical harmonic lighting, the SetupVisitor must be called on the scene graph. A light node must also be defined, updated to use the correct vertex program and evaluated to use the correct light function coefficients. An example of use:

Example 2: Rendering using SH lighting

    SH::Sample::Init();
    SH::SetupVisitor shv( modelFileName );
    rootnode->accept( shv );
    vrutils::CIEGeneralModel* light = new vrutils::CIEGeneralModel( rootnode );
    light->setVertexShader( shv.getVertexShader() );
    light->Update();

The light node used in example 2 is defined later, in chapter 6.3 about analytical daylight models. Some sort of light node control should be created, perhaps as an update callback or an event handler, but that is left as a later exercise.


6 Lighting models

6.1 Implementing light with spherical harmonics

With spherical harmonics, any spherical function can be used as a light source. A synthetic light source can be generated directly as spherical harmonics coefficients, an angle map projection of an HDR image can be used as a light source, or analytical models defined on the sphere can be used as light sources [6].

A nice type of light source is an HDR image [6]. An HDR image is an RGB bitmap captured by a special camera. By using available software the image can be converted into an angle map projection, which is easily plugged into the spherical harmonics projection algorithm. All that is needed is to write a wrapping function that converts spherical coordinates into texture coordinates, a process which is fairly simple to derive.
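One possible derivation of that wrapping function is sketched below, assuming the Debevec-style angular map convention with the image centered on the -z direction. This convention is an assumption; other probe formats use different mappings:

```cpp
#include <algorithm>
#include <cmath>

// Convert spherical coordinates (theta measured from +z, phi around z)
// to angular-map texture coordinates in [0,1]^2.
void sphericalToAngularMap( double theta, double phi,
                            double& u, double& v )
{
    // Direction on the unit sphere.
    const double dx = std::sin( theta ) * std::cos( phi );
    const double dy = std::sin( theta ) * std::sin( phi );
    const double dz = std::cos( theta );

    // Radial distance in the map is proportional to the angle from -z;
    // clamp guards acos against rounding just outside [-1,1].
    const double d = std::sqrt( dx * dx + dy * dy );
    const double a = std::acos( std::max( -1.0, std::min( 1.0, -dz ) ) );
    const double r = ( d > 0.0 ) ? a / ( M_PI * d ) : 0.0;

    u = 0.5 + 0.5 * dx * r; // remap [-1,1] to [0,1]
    v = 0.5 + 0.5 * dy * r;
}
```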

Synthetic light sources can be modeled by generating spherical harmonics coefficients directly. By symbolically integrating a function f(\theta, \phi) [6], the formulas for the coefficients can be found. The function f should return one if theta is less than a specified threshold t, and zero if the angle is greater:

\[ f_t(\theta, \phi) = \begin{cases} 1 & 0 \le \theta < t \\ 0 & \text{otherwise} \end{cases} \]
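The point evaluation of this function is trivial; a sketch (the name is illustrative), useful for checking the symbolically derived coefficients against a numerical projection:

```cpp
// Synthetic "cone" light f_t: one inside a cone of half-angle t around
// the pole, zero elsewhere.
double coneLight( double theta, double /*phi*/, double t )
{
    return ( theta >= 0.0 && theta < t ) ? 1.0 : 0.0;
}
```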

The big advantage of using spherical harmonics is that it is very easy to use analytical low-frequency models, such as analytical models for daylight and for the night sky. As this is the focus of this thesis, a number of such models will be described below.

6.2 Analytic models

The standards body CIE has developed several analytical models for daylight [8, 9]. These models all have one thing in common: they are all based on the sun's position on the hemisphere and the luminance at zenith.

Calculating the sun's position on the hemisphere (or in horizontal coordinates, altitude and azimuth) is actually quite involved [15]. The horizontal coordinates of the sun depend on three factors: the longitude and latitude of the observer and the solar time (often expressed as a julian date).

The julian date is the number of days that have passed since January 1st, 4713 B.C. It can be calculated with the following formula:

\[ julian = 1720996.5 - \lfloor year/100 \rfloor + \lfloor year/400 \rfloor + \lfloor 365.25 \cdot year \rfloor + \lfloor 30.6001 \cdot (month + 1) \rfloor + day + \frac{hour + minute/60 + second/3600}{24} \]
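A sketch of this calculation in C++. The usual convention of treating January and February as months 13 and 14 of the previous year is assumed here (it is not spelled out above), since the formula only gives correct julian dates with that adjustment:

```cpp
#include <cmath>

// Julian date from a calendar date and time (UT).
double julianDate( int year, int month, int day,
                   int hour, int minute, int second )
{
    // Assumed convention: January and February count as months 13 and 14
    // of the previous year.
    if ( month <= 2 ) { year -= 1; month += 12; }
    return 1720996.5
         - std::floor( year / 100.0 )
         + std::floor( year / 400.0 )
         + std::floor( 365.25 * year )
         + std::floor( 30.6001 * ( month + 1 ) )
         + day
         + ( hour + minute / 60.0 + second / 3600.0 ) / 24.0;
}
```

For instance, noon on January 1st, 2000 corresponds to julian date 2451545.0, a commonly used epoch.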

By using the julian date, the ecliptic longitude of the sun can be found, and by converting the ecliptic longitude to equatorial coordinates (declination and right ascension), all that is needed to calculate the sun's position on the hemisphere has been found. The equatorial coordinates are simply projected into horizontal coordinates and the sun's position has been calculated.

The altitude (\theta_s) represents the solar angle from zenith and is defined in the range [0, \pi], where angles below \pi/2 are above the horizon and greater angles are below the horizon. Positive azimuthal angles (\phi_s) represent angles west of south.

Good candidates for parameterization in this model of the sun are the julian date and the geographic position of the observer.

The second value we need is the luminance at zenith. This value depends on the position of the sun (more specifically the altitude \theta_s) and a description of the turbidity of the sky.

The model used in this thesis was developed by Norio Igawa, Hiroshi Nakamura and Kunio Matsuura [7] using regression analysis of real atmospheric data sets.

The luminance at zenith can be defined as a function of the sky turbidity:

\[ L_z = e^{A T^5 + B T^4 + C T^3 + D T^2 + E T + F} \]

where T is the atmospheric turbidity in the range [0, 1], where 0.2 represents an overcast sky and 1.0 represents a clear sky.

The coefficients A to F are defined as follows:

    A =  18.373*a +  9.955
    B = -52.013*a - 37.766
    C =  46.572*a + 59.352
    D =   1.691*a^2 - 16.498*a - 48.670
    E =   1.124*a + 19.738
    F =   1.170*ln(a) + 6.369

where a is the solar angle above the horizon (\pi/2 - \theta_s).

It should be noted that this function is only defined for solar positions above the horizon (which makes sense for a daylight model).
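As a sketch, the zenith luminance can be computed directly from the coefficients above (the function name is illustrative, not from the thesis code):

```cpp
#include <cmath>

// Zenith luminance from turbidity T in [0,1] and the solar altitude
// a = pi/2 - theta_s (radians, above the horizon).
double zenithLuminance( double T, double solarAltitude )
{
    const double a = solarAltitude;
    const double A =  18.373 * a +  9.955;
    const double B = -52.013 * a - 37.766;
    const double C =  46.572 * a + 59.352;
    const double D =   1.691 * a * a - 16.498 * a - 48.670;
    const double E =   1.124 * a + 19.738;
    const double F =   1.170 * std::log( a ) + 6.369;
    return std::exp( A * std::pow( T, 5 ) + B * std::pow( T, 4 )
                   + C * std::pow( T, 3 ) + D * T * T + E * T + F );
}
```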

6.2.1 Older CIE models

The CIE standards body developed three models for daylight in 1996 [8, 9]. They were primarily developed for judging architectural designs, but they fit well into an analytical system for computation of daylight illumination. The standard is available from CIE for approximately ten euro.

The simplest of the three models is the Overcast Sky model (see figure 10). This model only depends on the luminance at zenith and the position on the hemisphere at which we wish to calculate the luminance. It is defined as follows:

\[ L_p = L_z \frac{1 + 2 \sin\theta_p}{3} \]

where L_p is the luminance at the chosen point, L_z is the luminance at zenith and \theta_p is the altitude of the chosen point.
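A sketch of the Overcast Sky evaluation (the name is illustrative); note that the luminance equals L_z at the zenith (\theta_p = \pi/2) and L_z/3 at the horizon:

```cpp
#include <cmath>

// CIE Overcast Sky: luminance at a sky point of altitude thetaP given
// the zenith luminance Lz.
double overcastLuminance( double Lz, double thetaP )
{
    return Lz * ( 1.0 + 2.0 * std::sin( thetaP ) ) / 3.0;
}
```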


Figure 10: CIE Overcast Sky

The Clear Sky model (see figure 11) is more complicated, as it also depends on the position of the sun relative to the point we wish to calculate. It is defined as:

\[ L_p = L_z \frac{(0.91 + 10 e^{-3\gamma} + 0.45 \cos^2\gamma)(1 - e^{-0.32 / \cos\theta_p})}{(0.91 + 10 e^{-3\theta_s} + 0.45 \cos^2\theta_s)(1 - e^{-0.32})} \]

where variables with subscript s are the spherical coordinates of the sun and variables with subscript p are the spherical coordinates of the chosen point. The variable \gamma is the planar angle between the solar vector and the point vector. It can be computed by using the dot product of the vectors.

Figure 11: CIE Clear Sky

The Partly Overcast Sky model (see figure 12) is a combination of the other two models and is defined as follows:

\[ L_p = L_z \frac{(0.526 + 4 e^{-1.5\gamma})(1 - e^{-0.8 / \cos\gamma})}{(0.526 + 5 e^{-1.5\theta_s})(1 - e^{-0.8})} \]


Figure 12: CIE Partly Overcast Sky

The only candidate for parameterization (that has not been considered before) is the atmospheric turbidity factor. A small problem is at what turbidity levels the currently used model should be changed. More about this later.

6.2.2 CIE General Sky model

In 2003 a new model, the CIE Standard General Sky model [9], was accepted as a new standard by CIE. This new model is the result of an international consensus on luminance distribution in the atmosphere, and of standardization.

The luminance at a specific point on the hemisphere is defined as:

\[ L_p = L_z \frac{f(\gamma)\, p(\theta_p)}{f(\theta_s)\, p(0)} \]

where the functions f and p are defined as follows:

\[ f(x) = 1 + c \left( e^{d x} - e^{d \pi / 2} \right) + e \cos^2 x \]

\[ p(x) = 1 + a\, e^{b / \cos x} \]

The parameters a to e used in the functions above are defined in a table with fifteen groups of different values, where group one corresponds to the old CIE Overcast Sky model, while groups twelve to fifteen correspond to the old CIE Clear Sky model. The table of parameters can be found in an article by Stanislav Durala and Richard Kittler [4].

With this model it becomes easier to select which group of parameters to use: the turbidity parameter is simply multiplied by fifteen and used as an index into the table.
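A sketch of the General Sky evaluation. The parameter values used in the test are illustrative overcast-like placeholders; the authoritative table of fifteen groups is in the article by Durala and Kittler [4]:

```cpp
#include <cmath>

struct SkyParams { double a, b, c, d, e; };

// Indicatrix f and gradation p as defined above.
static double f( double x, const SkyParams& s )
{
    return 1.0 + s.c * ( std::exp( s.d * x ) - std::exp( s.d * M_PI / 2.0 ) )
               + s.e * std::cos( x ) * std::cos( x );
}

static double p( double x, const SkyParams& s )
{
    return 1.0 + s.a * std::exp( s.b / std::cos( x ) );
}

// thetaP and thetaS are angles from zenith; gamma is the angle between
// the sky point and the sun. At the zenith (thetaP = 0, gamma = thetaS)
// the result reduces to Lz.
double generalSkyLuminance( double Lz, double gamma, double thetaP,
                            double thetaS, const SkyParams& s )
{
    return Lz * ( f( gamma, s ) * p( thetaP, s ) )
              / ( f( thetaS, s ) * p( 0.0, s ) );
}
```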

6.3 Integration with framework

A light source should be defined as a spherical function, so it can be point evaluated and averaged. To define a light source, derive the class from LightNode and overload the evalPoint method. This method takes spherical coordinates and returns a floating-point value.

Both analytical models are implemented as light sources derived from LightNode. Both of these models use the sun as an argument. Therefore the sun is defined with its own class, Sun. This class encapsulates all that is needed to calculate the position of the sun on the hemisphere.

The two classes defined for the analytical models, CIESkyModel for the older CIE models and CIEGeneralModel for the new CIE model, are identical when considered from an API angle. There are only some internal differences, which won't be explained here. See the previous chapters (6.2.1 and 6.2.2) for the differences.

Both of the models use the same class (Sun) for sun control, and the same zenith model as described in chapter 6.2.

To control the models, three methods in Sun are used, along with three methods in the model classes themselves:

class Sun:

    setLongitude( char direction, int degrees, int minutes, int seconds )
    setLatitude( char direction, int degrees, int minutes, int seconds )
    setTime( int year, int month, int day, int hour, int minute, int second, bool summer )

The direction parameters to the longitude and latitude controls are "E", "W" in the longitude case and "N", "S" in the latitude case.

class Model:

    setTurbidity( float T )
    Sun& getSun()
    void Update()

The Update method will update the local vertex program parameters in the LightNode, so that the new parameters are used by the vertex program.

To see that the models actually work, an animation callback has been defined that simply increases the time, effectively making the sun move across the sky. This callback is CIECallback. To add it to the application, see example 3.

Example 3: Animation callback

    vrutils::Model::CIECallback *callback =
        new vrutils::Model::CIECallback( light );
    rootnode->setUpdateCallback( callback );


7 Results

7.1 Performance

The performance can be divided into two separate parts: the performance of the precalculation stage and the performance of the real-time rendering stage.

The performance of the precalculation depends on three things: the number of bands of spherical harmonics functions, the number of spherical samples used and the performance of the raytracer (if shadows are included).

It is easy to realize that including shadows will be very expensive, since an intersection test with the entire graph needs to be done for each vertex-sample pair. Because of this, there is no real need to look at the performance difference between shadows and no shadows.

Which factor is the most limiting one when considering the number of samples and the number of bands?

[Plot: calculation time (s) against the number of bands (2 to 5) for a 4000-vertex model with shadows; curves for 100, 400, 900 and 1600 samples.]

Figure 13: 4000 vertices with shadows

From the graphs in figures 13 and 14 it can be seen that the number of bands used has a very small effect on the calculation time. The most limiting factor by far is the number of spherical samples used. If the calculation time without shadows is considered (figure 15), it can be seen that the performance of the raytracer is by far the most limiting factor.

[Plot: calculation time (s) against the number of bands (2 to 5) for a 50000-vertex model with shadows; curves for 100 and 400 samples.]

Figure 14: 50000 vertices with shadows

[Plot: calculation time (s) against the number of bands (2 to 5) for a 50000-vertex model without shadows; curves for 100, 400, 900, 1600 and 2500 samples.]

Figure 15: 50000 vertices without shadows

The performance of the real-time rendering stage depends on both hardware and software factors. The hardware factors are hard to affect. The most important hardware factors limiting the performance here are the vertex program performance of the graphics card and driver, and the speed of vertex attribute sending. The software factors can basically be limited to only one, and that is the time it takes to recalculate the light coefficients. This is not necessary to do each frame though (only when the light is actually changed), so in most cases this won't be a big factor. The results below, however, are based on the light being updated each frame.

As can be seen in figures 16 and 17, the greatest effects on the frame rate come from the updating of the sky models and the number of vertices in the models. But the number of bands (and hence the amount of data sent to the vertex program) also affects the frame rate to some extent.


[Plot: frame rate against the number of bands (2 to 4) for the 4000-vertex model with 100 samples; curves for sun animated and no animation.]

Figure 16: 4000 vertices

[Plot: frame rate against the number of bands (2 to 4) for the 50000-vertex model with 100 samples; curves for sun animated and no animation.]

Figure 17: 50000 vertices


7.2 Visible results

Because of the subjective nature of the human eye, the visible results are hard to discuss. The only thing that can be done is to supply a number of screenshots.

This is a model using 100 spherical samples and four bands of spherical harmonics functions. The first picture shows sunrise, the second slightly after noon and the third sunset. The model has approximately 4000 vertices.


The following model is a more complicated one with 50000 vertices. The number of samples is still 100 and the number of bands is four.

As can be seen in the pictures, the shadows and light move as the sun does.


8 Discussion

8.1 Motivation

The reason why global illumination was used, and not some direct illumination model coupled with a shadowing algorithm, is that to correctly model daylight the full hemisphere needs to be included for each vertex. Therefore the integral over the hemisphere is needed, and thus a global illumination model.

Spherical harmonic lighting was chosen because daylight is low-frequency by nature, and spherical harmonics work well in such situations. Analytical models such as those used in this thesis are hard to model with raytracing and photon mapping, and because the light changes in real time, radiosity would be too slow to use.

The CIE models were chosen simply because they are the industrial standards for defining daylight. They are very simple to use and very well-defined.

8.2 Performance and results

It might not be visible in the screenshots in the previous chapter (7.2), but the visible results start to degrade when using below approximately 200 samples per vertex: the shadows begin showing aliasing effects. For better results, over 1000 samples should be used. This is unfeasible for large models, however. The approach used in this thesis is to use only 100 samples, simply because in models with more, and smaller, polygons these effects will be less apparent. The number of bands of spherical harmonics used only affects the hardness of the shadows. With more coefficients the shadows are closer to reality, showing somewhat more hardness than with a low number of coefficients.

Something that can be done to increase the performance of both stages is to optimize the models. The raytracer is very important for the precalculation stage, and can be built using standard raytracer optimization techniques to increase performance.

The visible results can be odd at times, because the luminance values are linearly scaled and used directly. Since luminance values aren't linearly related to brightness, this will cause oddities. To fix this, a tone reproduction technique should be used to convert luminance into brightness [2].


9 Acknowledgments

First, I would like to thank Ola Ågren for helping me out with the report. I would also like to thank Anders Backman for his feedback and help during the work on this master's thesis.

Finally, I would like to thank my family and friends for their support when things weren't working out so well.


References

[1] Michael F. Cohen and Donald P. Greenberg. The hemi-cube: A radiosity solution for complex environments. In Proceedings of the 12th annual conference on Computer graphics and interactive techniques, San Francisco, California, USA, pages 31–40. ACM Press, July 22–26 1985.

[2] Kate Devlin, Alan Chalmers, Alexander Wilkie, and Werner Purgathofer. Tone Reproduction and Physically Based Spectral Rendering. State of the Art Reports, Eurographics 2002, Saarbrücken, Germany, pages 101–123, September 2–6 2002.

[3] Steven M. Drucker. Radiosity: An Illuminating Perspective. General exam summation, Media Laboratory, MIT, 1991.

[4] Stanislav Durala and Richard Kittler. CIE General Sky Standard Defining Luminance Distributions. In Proceedings eSim 2002, Montreal, Canada, September 11–13 2002.

[5] Cindy M. Goral, Kenneth E. Torrance, Donald P. Greenberg, and Bennett Battaile. Modeling the interaction of light between diffuse surfaces. In Proceedings of the 11th annual conference on Computer graphics and interactive techniques, Minneapolis, Minnesota, USA, pages 213–222. ACM Press, July 23–27 1984.

[6] Robin Green. Spherical Harmonic Lighting: The Gritty Details. Pdf document, Sony Computer Entertainment America, 2003. http://www.research.scea.com/gdc2003/spherical-harmonic-lighting.pdf.

[7] Norio Igawa, Hiroshi Nakamura, and Kunio Matsuura. Sky Luminance Distribution Model for Simulation of Daylit Environment. In Proceedings of Building Simulation 1999, Kyoto, Japan, volume 2, pages 969–975, September 13–15 1999.

[8] International Commission on Illumination. ISO 15469/CIE S003: Spatial distribution of daylight - CIE standard overcast sky and clear sky, 1996.

[9] International Commission on Illumination. CIE DS 011.1/E-2001: Spatial distribution of daylight - CIE standard general sky, 2001.

[10] Henrik Wann Jensen. Rendering Caustics on Non-Lambertian Surfaces. Computer Graphics Forum, 16(1):57–64, 1997.

[11] Timothy R. Jozwowski. Real Time Photon Mapping. Master's Thesis, Michigan Technological University, May 2002.

[12] Jan Kautz, Peter-Pike Sloan, and John Snyder. Fast, arbitrary BRDF shading for low-frequency lighting using spherical harmonics. In Proceedings of the 13th Eurographics workshop on Rendering, Pisa, Italy, pages 291–296. Eurographics Association, June 26–28 2002.


[13] Robert Osfield. Introduction to the OpenSceneGraph. Webpage, January 2004. http://openscenegraph.sourceforge.net/introduction/index.html.

[14] Robert Osfield. OpenSceneGraph Homepage. Webpage, January 2004. http://openscenegraph.sourceforge.net/.

[15] A. J. Preetham, Peter Shirley, and Brian Smits. A practical analytic model for daylight. In Proceedings of the 26th annual conference on Computer graphics and interactive techniques, Los Angeles, California, USA, pages 91–100. ACM Press/Addison-Wesley Publishing Co., August 8–13 1999.

[16] Peter-Pike Sloan, Jan Kautz, and John Snyder. Precomputed radiance transfer for real-time rendering in dynamic, low-frequency lighting environments. ACM Trans. Graph., 21(3):527–536, 2002.

[17] Turner Whitted. An Improved Illumination Model for Shaded Display. Communications of the ACM, 23(6):343–349, 1980.

[18] Chris Wynn. An Introduction to BRDF-Based Lighting. Webpage, 2002. http://www.nvidia.com/attach/1473.


Index Entries

anisotropic, 5
Appel, Arthur, 11
BRDF, 4–5, 8
Bui-Tuong, Phong, 8
CIE, 33, 34
    Clear Sky model, 34
    Overcast Sky model, 34
    Partly Overcast Sky model, 35
    Standard General Sky model, 36
Cohen, 13
color bleeding, 6, 12, 16
conservation of energy, 5
Contour Integral evaluation, 13
differential solid angle, 6
ecliptic longitude, 33
equatorial coordinates, 33
flat shading, 8
form factor, 13
Gouraud shading, 8, 13
HDR image, 33
hemicube evaluation, 13–14
horizontal coordinates, 33
isotropic, 5
Jozwowski, Timothy, 16
Julian date, 33
Kautz, Jan, 16
Law of Large Numbers, 23
Law of Reflection, 4, 15
Legendre polynomials, 19, 21
Legendre polynomials, Associated, 17, 19–21
Monte-Carlo integration, 12, 23
occluded ambient, 25
OpenSceneGraph, 29
Phong model, 8
Phong shading, 8
photon mapping, 9–10, 15–16
progressive radiosity, 14
radiosity, 9, 12–15
raytracing, 9, 11–12, 16
reciprocity, 5
Rendering Equation, 7, 26
Sloan, Peter-Pike, 16
Snyder, John, 16
spherical harmonics, 10, 16–27, 33


tone reproduction, 45
Wann Jensen, Henrik, 15, 16
Whitted, Turner, 11
