CS361 Week 6 - Wednesday

Transcript of CS361

Page 1: CS361

CS361 Week 6 - Wednesday

Page 2: CS361

Last time

What did we talk about last time? Lights, materials, and sensors

Page 3: CS361

Questions?

Page 4: CS361

Project 2

Page 5: CS361

Back to Sensors

Page 6: CS361

Real sensors

In general, sensors are made up of many tiny sensors: rods and cones in the eye, photodiodes attached to a CCD in a digital camera, and dye particles in traditional film

Typically, an aperture restricts the directions from which the light can come

Then, a lens focuses the light onto the sensor elements

Page 7: CS361

Radiance

Irradiance sensors can't produce an image because they average over all directions

Lens + aperture = directionally specific

Consequently, the sensors measure radiance (L), the density of light flow per area AND per incoming direction

Page 8: CS361

Idealized sensors

In a rendering system, radiance is computed rather than measured

A radiance sample for each imaginary sensor element is made along a ray that goes through the point representing the sensor and point p, the center of projection for the perspective transform

The sample is computed by using a shading equation along the view ray v
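As a rough illustration of how such a view ray might be generated, here is a minimal Python sketch (not from the slides), assuming a pinhole camera sitting at the center of projection and looking down -z with a given vertical field of view; the function name view_ray and its parameters are hypothetical.

```python
import math

def view_ray(px, py, width, height, fov_y, aspect):
    """Direction of the view ray through pixel (px, py), in camera space.

    Assumes a pinhole camera at the origin (the center of projection p)
    looking down -z, with vertical field of view fov_y in radians.
    """
    # Normalized device coordinates in [-1, 1], sampling pixel centers
    ndc_x = (2.0 * (px + 0.5) / width - 1.0) * aspect
    ndc_y = 1.0 - 2.0 * (py + 0.5) / height
    half_h = math.tan(fov_y / 2.0)
    dx, dy, dz = ndc_x * half_h, ndc_y * half_h, -1.0
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)

# Example: the ray through the middle of a 640x480 image points (almost) straight ahead
print(view_ray(320, 240, 640, 480, math.radians(60), 640 / 480))
```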

Page 9: CS361

Student Lecture: Lambertian, Gouraud, and Phong Shading

Page 10: CS361

Shading

Page 11: CS361

Shading equations

We need a mathematical equation to say what the color (radiance) at a particular pixel is

There are many equations to use and people still do research on how to make them better

Remember, these are all rule-of-thumb approximations and are only distantly related to physical law

Page 12: CS361

Lambertian shading

Lambertian (diffuse) shading assumes that outgoing radiance is (linearly) proportional to irradiance

Diffuse exitance: Mdiff = cdiff EL cos θ

Because diffuse radiance is assumed to be the same in all directions, we divide by π (explained later)

Final Lambertian radiance: Ldiff = (cdiff / π) EL cos θ
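A minimal Python sketch (not from the slides) of the Lambertian term above, assuming n and l are unit vectors and treating cdiff and EL as single-channel scalars; the names are hypothetical.

```python
import math

def lambertian_radiance(c_diff, e_l, n, l):
    """L_diff = (c_diff / pi) * E_L * cos(theta), with cos(theta) clamped to zero
    so surfaces facing away from the light receive no diffuse contribution.

    n and l are assumed unit-length; c_diff and e_l are single-channel scalars.
    """
    cos_theta = max(0.0, n[0] * l[0] + n[1] * l[1] + n[2] * l[2])
    return (c_diff / math.pi) * e_l * cos_theta

# Light hitting the surface head-on (n parallel to l)
print(lambertian_radiance(0.8, 3.0, (0.0, 0.0, 1.0), (0.0, 0.0, 1.0)))
```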

Page 13: CS361

Specular shading

Specular shading depends on the angles between the surface normal and the light vector, and between the surface normal and the view vector

For the calculation, we compute h, the half vector, halfway between v and l:

h = (l + v) / ||l + v||

Page 14: CS361

Specular shading equation

The total specular exitance is almost exactly the same as the total diffuse exitance: Mspec = cspec EL cos θ

What is seen by the viewer is a fraction of Mspec dependent on the half vector h

Final specular radiance: Lspec = ((m + 8) / (8π)) (cos φh)^m cspec EL cos θ, where φh is the angle between h and the surface normal

Where does m come from? It's the smoothness parameter
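A minimal Python sketch (not from the slides) of the specular term above, assuming unit-length n, l, and v and single-channel values; the helper names are hypothetical.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def specular_radiance(c_spec, e_l, n, l, v, m):
    """L_spec = ((m + 8) / (8*pi)) * (cos phi_h)^m * c_spec * E_L * cos(theta),
    where h = normalize(l + v) and phi_h is the angle between n and h."""
    h = normalize((l[0] + v[0], l[1] + v[1], l[2] + v[2]))
    cos_theta = max(0.0, n[0] * l[0] + n[1] * l[1] + n[2] * l[2])
    cos_phi_h = max(0.0, n[0] * h[0] + n[1] * h[1] + n[2] * h[2])
    return ((m + 8.0) / (8.0 * math.pi)) * (cos_phi_h ** m) * c_spec * e_l * cos_theta

# A fairly smooth surface (m = 100) lit and viewed from 45 degrees on either side
l = normalize((1.0, 0.0, 1.0))
v = normalize((-1.0, 0.0, 1.0))
print(specular_radiance(0.5, 3.0, (0.0, 0.0, 1.0), l, v, 100))
```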

Page 15: CS361

Implementing the shading equation

Final lighting is:

Lo(v) = Σi=1..n (cdiff / π + ((m + 8) / (8π)) (cos φhi)^m cspec) ELi cos θi

We want to implement this in shaders (a sketch follows below)

The book goes into detail about how often it is computed

Note that many terms can be precomputed; only the ones with angles in them change
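To make the precomputation point concrete, here is a minimal Python sketch (not from the slides) that evaluates the sum over lights, hoisting the angle-free factors cdiff/π and (m + 8)/(8π) out of the per-light loop; all names are hypothetical and values are single-channel.

```python
import math

def shade(n, v, lights, c_diff, c_spec, m):
    """Sum of diffuse + specular contributions over all lights:
    L_o(v) = sum_i (c_diff/pi + k_spec * (cos phi_h_i)^m * c_spec) * E_L_i * cos theta_i

    lights is a list of (l, e_l) pairs, with l a unit light direction.
    The terms with no angles in them are precomputed outside the loop;
    only the cosines change per light.
    """
    diff_const = c_diff / math.pi
    k_spec = (m + 8.0) / (8.0 * math.pi)
    total = 0.0
    for l, e_l in lights:
        hx, hy, hz = l[0] + v[0], l[1] + v[1], l[2] + v[2]
        inv = 1.0 / math.sqrt(hx * hx + hy * hy + hz * hz)
        h = (hx * inv, hy * inv, hz * inv)
        cos_theta = max(0.0, n[0] * l[0] + n[1] * l[1] + n[2] * l[2])
        cos_phi_h = max(0.0, n[0] * h[0] + n[1] * h[1] + n[2] * h[2])
        total += (diff_const + k_spec * cos_phi_h ** m * c_spec) * e_l * cos_theta
    return total

# Two lights: one overhead, one off to the side
print(shade((0, 0, 1), (0, 0, 1), [((0, 0, 1), 2.0), ((0.6, 0, 0.8), 1.0)], 0.7, 0.4, 50))
```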

Page 16: CS361

When should it be computed?

Computing the shading equation more often gives better visual results but takes more time

Flat shading: computes the shading equation once per primitive

Gouraud shading: computes the shading equation once per vertex, linearly interpolating colors for pixel values

Phong shading: computes color per pixel

Page 17: CS361

Aliasing

Page 18: CS361

Aliasing

When sampling any continuous thing (image, sound, wave) into a discrete environment (like the computer), different signals can end up being indistinguishable from each other

This is called aliasing

We can reduce aliasing by carefully considering how sampling and reconstruction of the signal are done

Page 19: CS361

Aliasing example

Ever seen the wheels of a car spinning the wrong way? Without enough samples, it may be impossible to tell which way they're spinning

You need a sampling frequency twice as high as the maximum frequency of the events to reconstruct the original signal

This is called the Nyquist limit
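A tiny Python illustration (not from the slides) of the Nyquist limit: sampling a 9 Hz sine at only 10 Hz (below the required 18 Hz) produces samples identical to those of a 1 Hz sine running backwards, the numerical analogue of the wheel appearing to spin the wrong way.

```python
import math

# Sample a 9 Hz sine at 10 Hz and compare against a -1 Hz (backwards) sine:
# the two columns of samples are identical, so the signals alias to each other.
fs = 10.0
for k in range(5):
    t = k / fs
    print(round(math.sin(2 * math.pi * 9 * t), 3),
          round(math.sin(2 * math.pi * -1 * t), 3))
```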

Page 20: CS361

Screen-based antialiasing

Jaggies are caused by insufficient sampling

A simple method to increase sampling is full-scene antialiasing (FSAA), which essentially renders to a higher resolution and then averages neighboring pixels together (a sketch follows below)

The accumulation buffer method is similar, except that the rendering is done with tiny offsets and the pixel values are summed together
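A minimal Python sketch (not from the slides) of the averaging step in the simple full-scene approach described above: a grayscale image rendered at twice the resolution is reduced by averaging each 2x2 block into one final pixel; the function name is hypothetical.

```python
def downsample_2x(hi_res):
    """Average each 2x2 block of a supersampled image (a list of rows of
    grayscale values) into one pixel of the final, half-resolution image."""
    h, w = len(hi_res), len(hi_res[0])
    return [[(hi_res[y][x] + hi_res[y][x + 1] +
              hi_res[y + 1][x] + hi_res[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard black/white edge rendered at 2x resolution becomes a softened edge at 1x
hi = [[0, 0, 1, 1],
      [0, 0, 1, 1],
      [0, 1, 1, 1],
      [0, 1, 1, 1]]
print(downsample_2x(hi))  # [[0.0, 1.0], [0.5, 1.0]]
```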

Page 21: CS361

FSAA schemes

A variety of FSAA schemes exist with different tradeoffs between quality and computational cost

Page 22: CS361

A-buffer

For non-interactive render speeds, the A-buffer can be used

The A-buffer generates a coverage mask for each fragment for each pixel

Fragments are thrown away if they have z-buffer values that are higher than fragments with full coverage

Final pixel color is based on fragment merging

Page 23: CS361

Multisample antialiasing

Supersampling techniques (like FSAA) are very expensive because the full shader has to run multiple times

Multisample antialiasing (MSAA) attempts to sample the same pixel multiple times but only run the shader once

Expensive angle calculations can be done once while different texture colors can be averaged

Color samples are not averaged if they are off the edge of a pixel

Page 24: CS361

Performance, speed, the future

Active research is still trying to find techniques with good visual output and good computational performance

Stochastic (random) sampling reduces the visual repetition of some artifacts

Sharing samples between pixels can reduce overall cost

Page 25: CS361

SharpDX Examples

Page 26: CS361

Upcoming

Page 27: CS361

Next time…

Review for Exam 1

Review all material covered so far

Exam 1 is Friday in class

Page 28: CS361

Reminders

Keep working on Project 2, due Friday, March 1

Keep reading Chapter 5

Start reading Chapter 6