
  • 7/27/2019 blinn


    SIMULATION OF WRINKLED SURFACES

James F. Blinn
Caltech/JPL

    Abstract

Computer generated shaded images have reached an impressive degree of realism with the current state of the art. They are not so realistic, however, that they would fool many people into believing they are real. One problem is that the surfaces tend to look artificial due to their extreme smoothness. What is needed is a means of simulating the surface irregularities that are on real surfaces. In 1973 Ed Catmull introduced the idea of using the parameter values of parametrically defined surfaces to index into a texture definition function which scales the intensity of the reflected light. By tying the texture pattern to the parameter values, the texture is guaranteed to rotate and move with the object. This is good for showing patterns painted on the surface, but attempts to simulate rough surfaces in this way are unconvincing. This paper presents a method of using a texturing function to perform a small perturbation on the direction of the surface normal before using it in the intensity calculations. This process yields images with realistic looking surface wrinkles without the need to model each wrinkle as a separate surface element. Several samples of images made with this technique are included.

    1. INTRODUCTION

Recent work in computer graphics has been devoted to the development of algorithms for making pictures of objects modelled by other than the conventional polygonal facet technique. In particular, several algorithms [4,5,7] have been devised for making images of parametric surface patches. Such surfaces are defined by the values of three bivariate functions:

X = X(u,v)

Y = Y(u,v)

Z = Z(u,v)

as the parameters vary between 0 and 1. Such algorithms basically consist of techniques for inverting the X and Y functions. That is, given the X and Y of a picture element, the corresponding u and v parameter values are found. This parameter pair is then used to find the Z coordinate of the surface to perform depth comparisons with other objects. The intensity of the resultant picture element is then found by a simulation of the light reflecting off the surface. Functions for performing this computation are described in [3].

The prime component in the calculation of the intensity of a picture element is the direction of the surface normal at that picture element.

To calculate the surface normal we first examine the derivatives of the surface definition functions. If the coordinates of a point on the patch are represented by the vector P:

P = (X,Y,Z)

The partial derivatives of these functions form two new vectors which we will call Pu and Pv.

    Pu = (Xu,Yu,Zu)

    Pv = (Xv,Yv,Zv)

These two vectors define a plane tangent to the surface at that point. Their cross product is thus a vector normal to the surface.

    N = Pu x Pv

These vectors are illustrated in figure 1. Before using the normal in intensity calculations it must first be scaled to a length of 1.0 by dividing by its length.

    Figure 1 - Definition of Normal Vector
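As a concrete illustration, the normal construction just described can be sketched in a few lines (a minimal sketch using NumPy; the function name unit_normal is ours, not the paper's):

```python
import numpy as np

def unit_normal(Pu, Pv):
    # Cross product of the tangent vectors gives a vector normal to
    # the surface; divide by its length to scale it to length 1.0.
    N = np.cross(Pu, Pv)
    return N / np.linalg.norm(N)

# Tangent vectors of the flat patch X(u,v)=u, Y(u,v)=v, Z(u,v)=0:
Pu = np.array([1.0, 0.0, 0.0])
Pv = np.array([0.0, 1.0, 0.0])
print(unit_normal(Pu, Pv))  # the unit normal (0, 0, 1)
```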

Images of smooth surfaces made directly from the patch description do not have the usual artifacts associated with polygonal facets; they do indeed look smooth. In fact they sometimes look too smooth. To make them look less artificial it is necessary to simulate some of the surface irregularities of real surfaces. Catmull [5] made some progress in this direction with a process called texture mapping. Effectively, the color of the surface was defined as a fourth bivariate function, C(u,v), and was used to scale the intensity of the generated picture at each point. This technique was good at generating pictures of objects with patterns painted on them. In order to simulate bumpy or wrinkly surfaces one might use, as the defining texture pattern, a digitized photograph of a bumpy or wrinkled surface. Attempts to do this were not very successful. The images usually looked like smooth surfaces with photographs of wrinkles glued on. The main reason for this is that the light source direction when making the texture photograph was rarely the same as that used when synthesizing the image. In fact, if the surface (and thus the mapped texture pattern) is curved, the angle of the light source vector with the surface is not even the same at different locations on the patch.

    2. NORMAL VECTOR PERTURBATION

To best generate images of macroscopic surface wrinkles and irregularities we must actually model them as such. Modelling each surface wrinkle as a separate patch would probably be prohibitively expensive. We are saved from this fate by the realization that the effect of wrinkles on the perceived intensity is primarily due to their effect on the direction of the surface normal (and thus the light reflected) rather than their effect on the position of the surface. We can expect, therefore, to get a good effect from having a texturing function which performs a small perturbation on the direction of the surface normal before using it in the intensity formula. This is similar to the technique used by Batson et al. [1] to synthesize aerial pictures of mountain ranges from topographic data.

The normal vector perturbation is defined in terms of a function which gives the displacement of the irregular surface from the ideal smooth one. We will call this function F(u,v). On the wrinkled patch the position of a point is displaced in the direction of the surface normal by an amount equal to the value of F(u,v). The new position vector can then be written as:

P' = P + F N/|N|

This is shown in cross section in figure 2.
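In code, the displacement reads directly off the formula (a sketch; NumPy and the name displaced_point are our choices, not the paper's):

```python
import numpy as np

def displaced_point(P, N, F):
    # P' = P + F * N/|N|: move the point along the unit surface
    # normal by the value of the bump function F(u,v).
    return P + F * N / np.linalg.norm(N)

P = np.array([1.0, 2.0, 0.0])
N = np.array([0.0, 0.0, 2.0])       # unnormalized normal of length 2
print(displaced_point(P, N, 0.25))  # displaced 0.25 units along +Z
```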

The partial derivatives involved are evaluated by the chain rule. So

Pu' = d/du P' = d/du (P + F N/|N|) = Pu + Fu N/|N| + F (N/|N|)u

Pv' = d/dv P' = d/dv (P + F N/|N|) = Pv + Fv N/|N| + F (N/|N|)v

The formulation of the normal to the wrinkled surface is now in terms of the original surface definition functions, their derivatives, and the bump function, F, and its derivatives. It is, however, rather complicated. We can simplify matters considerably by invoking the approximation that the value of F is negligibly small. This is reasonable for the types of surface irregularities for which this process is intended, where the height of the wrinkles in a surface is small compared to the extent of the surface. With this simplification we have

Pu' ≈ Pu + Fu N/|N|
Pv' ≈ Pv + Fv N/|N|

    The new normal is then

N' = (Pu + Fu N/|N|) x (Pv + Fv N/|N|)

= (Pu x Pv) + Fu (N x Pv)/|N|

+ Fv (Pu x N)/|N| + Fu Fv (N x N)/|N|^2

The first term of this is, by definition, N. The last term is identically zero. The net expression for the perturbed normal vector is then

N' = N + D

where D = ( Fu (N x Pv) - Fv (N x Pu) ) / |N|

This can be interpreted geometrically by observing that (N x Pv) and (N x Pu) are two vectors in the tangent plane to the surface. An amount of each of them proportional to the u and v derivatives of F is added to the original, unperturbed normal vector. See figure 3.
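The perturbed-normal formula translates directly into code (a sketch in NumPy; perturbed_normal is our name). On a flat patch a bump sloping upward in u tilts the normal back against the slope:

```python
import numpy as np

def perturbed_normal(Pu, Pv, Fu, Fv):
    # N' = N + D, with D = (Fu (N x Pv) - Fv (N x Pu)) / |N|.
    N = np.cross(Pu, Pv)
    D = (Fu * np.cross(N, Pv) - Fv * np.cross(N, Pu)) / np.linalg.norm(N)
    return N + D

# Flat patch: N = (0,0,1). With Fu = 0.5, Fv = 0 the result
# equals (-0.5, 0, 1), tilted toward -x.
Pu = np.array([1.0, 0.0, 0.0])
Pv = np.array([0.0, 1.0, 0.0])
print(perturbed_normal(Pu, Pv, Fu=0.5, Fv=0.0))
```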

Another geometric interpretation is that the vector N' comes from rotating the original vector N about some axis in the tangent plane to the surface. This axis vector can be found as the cross product of N and N'.


This is easily accomplished by masking off all but the low 6 bits of the IU and IV values. This also makes it easy to have the table represent a unit cell pattern to be replicated many times per patch. The function values U and V are merely scaled up by the replication count before being passed to FVAL.
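A sketch of such a lookup (hypothetical; the paper does not list its FVAL routine, and the 64x64 table size is inferred from the 6-bit mask):

```python
def fval(table, u, v, reps=1):
    # Scale u,v in [0,1] by the replication count, then mask off all
    # but the low 6 bits so the 64x64 unit cell wraps across the patch.
    size = 64
    iu = int(u * reps * size) & (size - 1)
    iv = int(v * reps * size) & (size - 1)
    return table[iu][iv]

# Checkerboard unit cell, replicated 4 times per patch:
table = [[(i + j) % 2 for j in range(64)] for i in range(64)]
print(fval(table, 0.51, 0.26, reps=4))  # wraps to table[2][2]
```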

Now that we know what to do with the table entries we turn to the question of how to generate them in the first place. Some simple geometric patterns can be generated algorithmically. One such is a gridwork of high and low values. The table entries of the F function for such a grid are shown plotted as a 3D line drawing in figure 5. The result when mapped onto a flat patch with one corner bent back is also shown.

    Figure 5 - Simple Grid Pattern
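One way such a gridwork table might be generated (a sketch; the sizes and values are arbitrary choices for illustration, not the paper's):

```python
def grid_table(size=64, cells=8, lo=0.0, hi=1.0):
    # High values along evenly spaced rows and columns, low elsewhere,
    # forming a gridwork of high and low values as in figure 5.
    step = size // cells
    return [[hi if (i % step == 0 or j % step == 0) else lo
             for j in range(size)]
            for i in range(size)]

F = grid_table()
print(F[0][5], F[3][5])  # 1.0 on a grid line, 0.0 between lines
```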

Embossed letters can be generated by using a bit-map character set as used to display text on a raster scan display. Such a texture array appears in figure 6. This pattern was used to make the title on the ribbon on the logo of the cover of these proceedings.

    Figure 6 - Embossed Letter Pattern


Another method of generating bump functions derives from image synthesis algorithms which use Z-buffers or depth buffers to perform the hidden surface comparisons [5]. The actual Z values left in the depth buffer after running such an algorithm can be used to define the table entries for a bump function. In figure 7 an image of a sphere was generated using such an algorithm and the resultant Z-buffer replicated several times to generate the rivet-like pattern. This is the pattern mapped onto the cube on the cover logo. Similarly, a 3D character set was used with a Z-buffer algorithm to generate the pattern showing the date, also in figure 7. This was used on the ribbon on the cover.

Figure 7 - Z-Buffer Patterns

The most general method of generating bump functions relies on video frame buffer technology and its standard tool, the painting program. Briefly, a frame buffer is a large digital memory with one word per picture element of an image. A video signal is continually synthesized from this memory so that the screen displays an image of what is in memory. A painting program utilizes a digitizing tablet to control the alteration of the values in the memory to achieve the effect of painting on the screen. By utilizing a region of the frame buffer as the defining table of the F function, a user can actually paint in the function values. The interpretation of the image

will be such that black areas produce small values of F and white areas produce large values. Since only the derivatives of F are used in the normal vector perturbation, any area of constant intensity will look smooth on the final image. However, places where the image becomes darker will appear as dents and places where it becomes brighter will appear as bumps. (This correspondence will be reversed if the base patch is rotated to view the back side.) The generation of interesting patterns which fit together end-to-end to form a continuous join between patches then becomes primarily an artistic effort on the part of the drawer. Figure 8 shows some sample results that can be achieved with this technique. The first pattern, a hand drawn unit cell of bricks, was mapped onto the sphere on the cover.

Figure 8 - Hand Drawn Bump Functions
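Since only the derivatives of F enter the perturbation, a painted grayscale table can be differentiated numerically. The central-difference scheme below is our own choice for illustration, not something the paper specifies:

```python
def bump_derivs(F, iu, iv):
    # Central-difference estimates of Fu and Fv from the painted table,
    # wrapping at the edges. Constant regions give zero derivatives
    # (smooth); darkening gives dents, brightening gives bumps.
    n = len(F)
    Fu = (F[(iu + 1) % n][iv] - F[(iu - 1) % n][iv]) / 2.0
    Fv = (F[iu][(iv + 1) % n] - F[iu][(iv - 1) % n]) / 2.0
    return Fu, Fv

# A table that brightens steadily in the u direction:
ramp = [[float(i) for j in range(8)] for i in range(8)]
print(bump_derivs(ramp, 3, 3))  # -> (1.0, 0.0)
```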


4. DEPENDENCE ON SCALE

One feature of the perturbation calculation is that the perturbation amount is not invariant with the scale at which the object is drawn. If the X, Y, and Z surface definition functions are scaled up by 2 then the normal vector length, |N|, is scaled up by a factor of 4 while the perturbation amount, |D|, is only scaled by 2. This effect is due to the fact that the object is being scaled but the displacement function F is not. (Scale changes due to the object moving nearer or farther from the viewer in perspective space do not affect the size of the wrinkles, only scale changes applied directly to the object.) The net effect of this is that if an object is scaled up, the wrinkles flatten out. This is illustrated in figure 9.

Figure 9 - Stretched Bump Texture (normal vs. stretched)

This effect might be desirable for some applications but undesirable for others. A scale invariant perturbation, D', must scale at the same rate as N. An obvious choice for this is

D' = a D |N|/|D|

so |D'| = a |N|

    where a is independent of scales in P. The valueof a is then the tangent of the effective rotationangle.

tan θ' = |D'|/|N| = a

This can be defined in various ways. One simple choice is a generalization from the simple, flat unit square patch

X(u,v) = u
Y(u,v) = v
Z(u,v) = 0

    For this patch the original normal vectorperturbation gives

N = (0,0,1)
D = (-Fu, -Fv, 0)
tan θ = sqrt(Fu^2 + Fv^2)

    Here the value of a is purely a function of F.Use of the same function for arbitrary patchescorresponds to a perturbation of

a = sqrt(Fu^2 + Fv^2)
D' = a D |N|/|D|
N' = N + D'

The texture defining function F is now no longer being used as an actual displacement added to the position of the surface. It just serves to provide (in the form of its derivatives) a means of defining the rotation axis and angle as functions of u and v.
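Putting section 4 together, a scale-invariant version of the perturbation might look like this (a sketch; the guard for a = 0 is our own addition):

```python
import numpy as np

def scale_invariant_normal(Pu, Pv, Fu, Fv):
    # Rescale D so that |D'| = a|N| with a = sqrt(Fu^2 + Fv^2); the
    # effective rotation angle then no longer depends on object scale.
    N = np.cross(Pu, Pv)
    if Fu == 0.0 and Fv == 0.0:
        return N  # no perturbation where F is locally flat
    D = (Fu * np.cross(N, Pv) - Fv * np.cross(N, Pu)) / np.linalg.norm(N)
    a = np.hypot(Fu, Fv)
    return N + a * D * np.linalg.norm(N) / np.linalg.norm(D)

# Doubling the patch scales |N| by 4 and |D'| by 4 as well, so the
# perturbed direction is unchanged.
Pu = np.array([1.0, 0.0, 0.0]); Pv = np.array([0.0, 1.0, 0.0])
n1 = scale_invariant_normal(Pu, Pv, 0.5, 0.0)
n2 = scale_invariant_normal(2 * Pu, 2 * Pv, 0.5, 0.0)
print(n1 / np.linalg.norm(n1) - n2 / np.linalg.norm(n2))  # ~ zero vector
```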

    5. ALIASING

In an earlier paper [2], the author described the effect of aliasing on images made with color texture mapping. The same problems can arise with this new form. That is, undesirable artifacts can enter the image in regions where the texture pattern maps into a small screen region. The solution applied to color textures was to average the texture pattern over the region corresponding to each picture element in the final image. The bump texture definition function, however, does not have a linear relationship to the intensity of the final image. If the bump texture is averaged the effect will be to smooth out the bumps rather

than average the intensities. The correct solution to this problem would be to compute the intensities at some high sub-pixel resolution and average them. Simply filtering the bump function can, however, reduce the more offensive artifacts of aliasing. Figure 10 shows the result of such an operation.
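A simple box filter over the table is one way to do such filtering (a sketch; the 3x3 wrapping kernel is our choice, not specified in the paper):

```python
def box_filter(F):
    # Replace each table entry by the average of its 3x3 neighborhood,
    # wrapping at the edges. This smooths the bumps themselves rather
    # than averaging intensities, but it tames the worst aliasing.
    n = len(F)
    return [[sum(F[(i + di) % n][(j + dj) % n]
                 for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9.0
             for j in range(n)]
            for i in range(n)]

spike = [[0.0] * 4 for _ in range(4)]
spike[1][1] = 9.0                 # a single sharp bump
smooth = box_filter(spike)
print(smooth[1][1])  # -> 1.0: the bump's height spreads to its neighbors
```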

Figure 10 - Filtering Bump Texture (before and after)


    6. RESULTS

Surfaces appearing in images made with this technique look quite convincingly wrinkled. An especially nice effect is the interaction of the bumps with calculated highlights. We must realize, however, that the wrinkles are purely illusory. They only come from some playing with the parameters used in intensity calculations. They do not, for example, alter the smooth silhouette edges of the object. A useful test of any image generation algorithm is to see how well the objects look as they move in animation sequences. Some sample frames from such an animation sequence appear in figure 11. The illusion of wrinkles continues to be convincing and the smoothness of the silhouette edges is not overly bothersome.

    Some simple timing measurements indicate thatbump mapping takes about 4 times as long as Phongshading and about 2 times as long as color texturemapping. The pictures in this paper took from 3to 7 minutes each to produce.

The author would like to thank Lance Williams and the New York Institute of Technology Computer Graphics Laboratory for providing some of the artwork and assistance in preparing the logo on the cover made with the techniques described in this paper.

    REFERENCES

[1] Batson, R. M., Edwards, E., and Eliason, M., "Computer Generated Shaded Relief Images", Journal of Research of the U.S. Geological Survey, Vol. 3, No. 4, July-Aug 1975, pp. 401-408.

[2] Blinn, J. F., and Newell, M. E., "Texture and Reflection in Computer Generated Images", CACM 19, 10, Oct 1976, pp. 542-547.

[3] Blinn, J. F., "Models of Light Reflection for Computer Synthesized Pictures", Proc. 4th Conference on Computer Graphics and Interactive Techniques, 1977.

[4] Blinn, J. F., "A Scan Line Algorithm for Displaying Parametrically Defined Surfaces", Proc. 5th Conference on Computer Graphics and Interactive Techniques, 1978.

[5] Catmull, E. E., "Computer Display of Curved Surfaces", Proc. IEEE Conf. on Computer Graphics, Pattern Recognition and Data Structures, Los Angeles, May 1975, p. 11.

[6] Whitted, J. T., "A Scan Line Algorithm for Computer Display of Curved Surfaces", Proc. 5th Conference on Computer Graphics and Interactive Techniques, 1978.

    Figure 11 - Rotating Textured Sphere