Physically-based Cosmetic Rendering

Cheng-Guo Huang, Tsung-Shian Huang, Wen-Chieh Lin, and Jung-Hong Chuang
Department of Computer Science, National Chiao Tung University, Taiwan

Abstract

Simulating realistic makeup effects is an important research issue in the 3D facial animation and cosmetic industries. Existing approaches based on image processing techniques, such as warping and blending, have mostly been applied to transfer one person's makeup to another. Although these approaches are intuitive and need only makeup images, they have some drawbacks, e.g., distorted shapes and fixed viewing and lighting conditions. In this paper, we propose an integrated approach, which combines the Kubelka-Munk model and a screen-space skin rendering approach, to simulate 3D makeup effects. The Kubelka-Munk model is used to compute the total transmittance when light passes through the cosmetic layers, while the screen-space translucent rendering approach simulates the subsurface scattering effects inside human skin. The parameters of the Kubelka-Munk model are obtained by measuring the optical properties of different cosmetic materials, such as foundations, blushes, and lipsticks. Our results demonstrate that the proposed approach is able to render realistic cosmetic effects on human facial models, and that different cosmetic materials and styles can be flexibly applied and simulated in real time.

Keywords: Skin rendering, translucent rendering, cosmetic rendering

1 Introduction

Realistic rendering of human faces with makeup is critical for many applications in the 3D facial animation and cosmetic industries since facial makeup is one of the most common routines for many females and even for some males. Makeup is a multi-layered process: skin care products are usually applied first, and cosmetics such as foundation, blush, lipstick, and eye-shadow are then smeared on the face. Smearing cosmetics on the face noticeably changes its appearance.

Several methods have been proposed to render makeup effects. These methods can be divided into two categories: cosmetic transfer and makeup simulation. Cosmetic transfer requires example makeup images to transfer makeup effects. The transfer process is computationally inexpensive and can therefore be done very efficiently. Nevertheless, cosmetic transfer has some limitations. For instance, each example image needs to be taken under a similar viewing and lighting direction, and the style of the makeup effects is restricted by the examples. Furthermore, cosmetic transfer can only generate makeup effects under a fixed viewing and lighting direction. In contrast, makeup simulation needs to know the optical properties of cosmetics and skin, such as scattering and absorption coefficients, to simulate the makeup appearance. These optical properties are usually captured using special devices. Once the optical properties are captured, the style of the makeup simulation can be set freely.

In this paper, we aim to simulate the realistic rendering of human skin with cosmetics applied. A screen-space skin rendering technique [1] and the physically-based Kubelka-Munk model [2] are employed to render the cosmetic effects. Some cosmetic effects, such as pearl and glitter effects, are not handled in the proposed method since their optical properties are difficult to measure. Our work makes the following two contributions: (1) proposing a framework that combines a screen-space skin rendering method and the Kubelka-Munk model for cosmetic rendering; (2) developing a makeup simulation system that allows users to modify cosmetic styles and materials in real time.

2 Related Work

Cosmetic transfer. Cosmetic transfer aims to transfer one person's makeup information to another. Such methods usually require one or more makeup examples as the source cosmetic information. Tong et al. used two example images of the source face, representing the facial appearance before and after makeup, to derive a cosmetic map [3]. The cosmetic map was then applied to an image of the target face to transfer the makeup effect. Guo and Sim proposed another approach to transfer cosmetic effects based on one source image and one target image [4]. They applied image processing techniques to analyze the source image and utilized the obtained information to generate the final result. Both methods utilized 2D image warping, which may distort the shape of the makeup. They also needed to capture the example images under the same or similar viewing conditions.

Scherbaum et al. built a database of facial geometry captured by structured light and used the database to synthesize makeup effects [5]. Their database stored pairs of images of human faces without and with makeup. From each pair of images, information such as the position, surface normal, diffuse map, scattering profile, specular map, and cosmetic map was extracted. At run time, the user input a 2D facial image to retrieve a similar 3D face. The 3D face was then deformed to fit the input 2D image. After fetching the suitable face, the best-matched facial appearance and cosmetic map in the principal component analysis (PCA) space were selected from the database. They then synthesized the cosmetic map and applied it to the diffuse map. This approach provided a good way to transfer makeup effects, but it still has some problems. First, the style of makeup is fixed; second, the system needs more hints to synthesize a plausible cosmetic map; finally, the system needs many examples to produce the appearance correctly.

Cosmetic simulation. Moriuchi et al. used a physical model and a statistical approach to simulate foundation [6]. They measured the foundation from different viewing angles and captured its reflectance between 400 nm and 700 nm. They then fitted the data with the Cook-Torrance model to find the appropriate parameters. Using the Cook-Torrance model to describe a translucent material, however, is not sufficient, because the model was designed to describe a BRDF, not translucent materials. Therefore, they employed a statistical approach to analyze the reflectance data using PCA. Doi et al. used the Kubelka-Munk model to render foundation on human skin [7]. They measured the reflectance of female skin and liquid foundations and showed the estimated skin color and foundation makeup using a color palette. Although they measured the skin reflectance, they did not apply it to 3D rendering. The translucent appearance of skin was also not handled. In this paper, we render cosmetic effects on 3D facial models using the diffusion approximation and thus can simulate the translucent effect.

Translucent and skin rendering. Human skin appearance can be simulated by translucent rendering. Among the different translucent rendering approaches, the diffusion approximation has been widely used in recent years. Jensen et al. introduced the dipole diffusion approximation to solve the radiative transfer equation [8]. By assuming that the thickness of the material is infinite and that the scattering coefficient σs is much larger than the absorption coefficient σa, they reduced the complex computation of interior light transport to a dipole function of distance and thus sped up the computation. Donner and Jensen extended the dipole approximation by adopting a multipole function to approximate thin-layered materials, such as skin and leaves [9]. After computing the reflectance of each layer, they used a multi-layered model to combine the layers. However, they did not deal with the measurement of the multipole model's parameters, and parameters acquired for the dipole model may not be usable in the multipole model.

d'Eon et al. further proposed a real-time texture-space algorithm for accelerating skin rendering [10]. They used a sum of Gaussians to fit the diffusion profile and replaced the costly surface integration with multiple Gaussian blurs. The computation can be done on the GPU while a physically plausible simulation is still retained. Jimenez et al. extended the texture-space technique to screen space by applying screen-space information to blur the result multiple times [1]. Krishnaswamy and Baranoski proposed a biophysically-based spectral model for skin parameter acquisition [11]. Donner et al. [12] and Ghosh et al. [13] also obtained skin parameters by fitting measurement data to the appearances of human faces. All of these methods were designed only for skin rendering, not cosmetics. We focus on handling the combined rendering effects of cosmetic and skin layers in this paper.

3 Approach

Facial makeup is a multi-layered structure. It usually consists of the skin layer, the foundation layer, the blush layer, and the eye-shadow layer. To render this multi-layered structure, we propose a framework consisting of a screen-space translucent rendering approach and the Kubelka-Munk model to simulate the makeup appearance. The proposed cosmetic rendering framework deals with two types of layers: cosmetic layers and skin layers. We assume that the cosmetics have neither pearl nor sparkle effects.

Figure 1 shows an overview of the proposed framework. To render human skin with cosmetics, our system first computes the reflectance and transmittance of the cosmetics. Next, the irradiance that passes through the cosmetic layers is calculated. After obtaining the irradiance of the skin, we apply a screen-space subsurface scattering method to estimate the outgoing radiance. Then the specular terms of the skin and cosmetics are computed. Finally, the radiance of the skin passing through the cosmetics is combined with the reflectance of the cosmetics.

Figure 1: Overview of our cosmetic rendering framework.

3.1 Reflectance and Transmittance of Cosmetics

When the incident light reaches the surface of the cosmetic layers, light is either reflected to the air or transmitted to the skin. To model this lighting behavior, we adopt the Kubelka-Munk model [2] to compute the reflectance and transmittance of each cosmetic layer, and then use the multi-layered theory to compute the final reflectance and transmittance. The Kubelka-Munk model computes the reflectance R_0 and transmittance T using the following equations:

R_0 = \frac{\sinh(bSX)}{a \sinh(bSX) + b \cosh(bSX)}    (1)

T = \frac{b}{a \sinh(bSX) + b \cosh(bSX)},    (2)

where S and K are the scattering and absorption coefficients per unit length, respectively, X is the thickness of the layer, a = 1 + K/S, and b = \sqrt{a^2 - 1}.

Note that R_0 denotes the reflectance that is not affected by the background beneath the layer. The scattering coefficient S and the absorption coefficient K can be obtained by applying an inversion procedure to the measured reflectance; the detailed measurement procedure is discussed in Section 4. The R_0 and T computed by the Kubelka-Munk model are wavelength dependent and cannot be directly used for rendering. Hence, we first convert the spectra of the total transmittance to the CIE XYZ color space, which is then converted to the RGB color space for rendering.
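For concreteness, the following sketch evaluates Equations 1 and 2 per wavelength sample. It is written in Python with NumPy purely for illustration (the paper's actual implementation is in HLSL), and the function name is ours.

```python
import numpy as np

def kubelka_munk_layer(S, K, X):
    """Reflectance R0 and transmittance T of a single layer
    (Equations 1 and 2).

    S, K : per-wavelength scattering/absorption coefficients
           per unit length (NumPy arrays)
    X    : layer thickness, in the same length unit
    """
    a = 1.0 + K / S
    b = np.sqrt(a * a - 1.0)
    denom = a * np.sinh(b * S * X) + b * np.cosh(b * S * X)
    R0 = np.sinh(b * S * X) / denom
    T = b / denom
    return R0, T
```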

In order to represent different cosmetic styles for rendering, we use a cosmetic map to mark the region and thickness of each makeup layer. We can therefore fetch the corresponding texel to check whether it is covered by a cosmetic or not. Figure 9(a) and Figure 10(a) illustrate two cosmetic maps.

Figure 2: Reflectance and transmittance of two non-homogeneous layers.

For multiple cosmetic layers, we need to combine the layers to obtain the final reflectance and transmittance. The combination of the reflectance and transmittance of two layers involves multiple inter-reflections. Figure 2 shows the path of the diffuse radiation for two layers. The reflectance and transmittance can be formulated as sums of geometric series, given by Equations 3 and 4. This formulation can be extended to cases that involve more than two layers.

R_{1,2} = R_1 + \frac{T_1 R_2 T_1}{1 - R_1 R_2}    (3)

T_{1,2} = \frac{T_1 T_2}{1 - R_1 R_2}    (4)
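Under the same assumptions as the sketch above, the two-layer composition of Equations 3 and 4 becomes a small helper; stacks of more than two layers are handled by folding it repeatedly, topmost layer first.

```python
def combine_layers(R1, T1, R2, T2):
    """Composite layer 1 on top of layer 2 (Equations 3 and 4).
    The 1/(1 - R1*R2) factor is the closed form of the geometric
    series of inter-reflections between the two layers."""
    denom = 1.0 - R1 * R2
    R12 = R1 + T1 * R2 * T1 / denom
    T12 = T1 * T2 / denom
    return R12, T12
```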

3.2 Skin Subsurface Scattering

We adopt the screen-space skin rendering technique proposed by Jimenez et al. [1] to render the appearance of human skin. In our work, we consider whether the surface is covered by cosmetics or not. If the surface is covered, the received energy is attenuated by the transmittance of the cosmetics. The total energy of light passing through the cosmetics is computed and stored in a screen-space irradiance map. The irradiance map is then blurred several times and the results are combined. In each blurring pass, alpha blending is used to accumulate the blurred result, where the blending weights are set according to the sum-of-Gaussians fit.
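A minimal sketch of this blur-and-blend step follows, again in Python for illustration. The (weight, variance) pairs are placeholders standing in for a sum-of-Gaussians fit to the skin diffusion profile (d'Eon et al. [10]); the depth-aware kernel corrections of the screen-space method [1] are omitted here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def screen_space_diffusion(irradiance, kernels, pixels_per_mm):
    """Approximate subsurface scattering by blending Gaussian blurs
    of the screen-space irradiance map (H x W x 3), which has already
    been attenuated by the cosmetic transmittance.

    kernels : list of (rgb_weight, variance_mm2) pairs taken from a
              sum-of-Gaussians fit to the diffusion profile
    """
    result = np.zeros_like(irradiance)
    for rgb_weight, variance_mm2 in kernels:
        sigma_px = np.sqrt(variance_mm2) * pixels_per_mm
        # Blur spatial axes only; the channel axis gets sigma 0.
        blurred = gaussian_filter(irradiance, sigma=(sigma_px, sigma_px, 0))
        result += np.asarray(rgb_weight) * blurred  # per-channel blend weight
    return result
```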

3.3 Specularity of Skin and Cosmetics

The specular term is the most important for describing the highlight effect. In our observation, applying cosmetics to human skin usually makes the skin surface look smoother. In particular, when foundation is applied, fine wrinkles on the skin surface are filled with the foundation, which changes the microfacets of the skin and makes the surface smoother. Unfortunately, the Kubelka-Munk model does not consider the specular term. Dorsey and Hanrahan [14] proposed a simple way to solve this problem. For each layer, they used the Kubelka-Munk model to compute the reflectance and transmittance and added a specular effect with the simple model C_s (\vec{N} \cdot \vec{H})^{1/r}, where C_s is the specular color, r is the roughness, and \vec{N} and \vec{H} are the normalized normal vector and half vector, respectively. We use a similar approach to simulate the specularity of each layer. Figure 3 shows the results of specular effects due to different foundations: the left image only considers the specularity of skin; the middle image only considers the specularity of foundation; the right image considers the specularity of both the foundation and skin layers.

Figure 3: The rendering result with specularity. Left: only skin specularity added; Middle: only foundation specularity added; Right: both skin and foundation specularity added.

For each layer, we use the BRDF specular model introduced by Kelemen et al. [15] to compute the specular term. This model is listed in Equation 5.

f_{r,spec}(\vec{L}, \vec{V}) = \frac{P_{\vec{H}}(\vec{H}) \, F(\vec{H} \cdot \vec{L})}{2 (1 + \vec{L} \cdot \vec{V})},    (5)

where \vec{L} is the normalized light direction, \vec{V} is the normalized view direction, \vec{H} is the normalized half vector, F(\vec{H} \cdot \vec{L}) is the Fresnel term, and P_{\vec{H}}(\vec{H}) is the Beckmann distribution function. For the Fresnel term, we use Schlick's approximation to simulate non-metal materials. The specular computation is performed from the bottom layer to the top layer: the remaining specular lighting is attenuated by the transmittance of the next layer and then combined with that layer's specular effect. In our implementation, we cannot measure the roughness and the index of refraction accurately. Therefore, we simply set a constant value and adjust the roughness to control the highlight. We believe that more realistic results could be achieved if these parameters could be measured.
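The sketch below illustrates Equation 5 with Schlick's Fresnel approximation and the bottom-to-top layer combination described above. The function names and the scalar f0 parameter are our own, and the real implementation is a pixel shader; this is only a per-point reference version.

```python
import numpy as np

def schlick_fresnel(cos_theta, f0):
    # Schlick's approximation for non-metal Fresnel reflectance.
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def beckmann(n_dot_h, m):
    # Beckmann microfacet distribution with roughness m.
    c2 = n_dot_h * n_dot_h
    return np.exp((c2 - 1.0) / (c2 * m * m)) / (np.pi * m * m * c2 * c2)

def kelemen_specular(N, L, V, m, f0):
    """Kelemen/Szirmay-Kalos specular BRDF (Equation 5);
    N, L, V are unit vectors."""
    H = (L + V) / np.linalg.norm(L + V)
    p_h = beckmann(max(np.dot(N, H), 1e-4), m)
    fresnel = schlick_fresnel(max(np.dot(H, L), 0.0), f0)
    return p_h * fresnel / (2.0 * (1.0 + np.dot(L, V)))

def layered_specular(specs, transmittances):
    """Combine per-layer specular terms from bottom (skin) to top.
    transmittances[i] is the transmittance of the layer whose
    specular term is specs[i]: each new layer attenuates the
    highlight accumulated beneath it, then adds its own."""
    total = 0.0
    for spec, T in zip(specs, transmittances):
        total = total * T + spec
    return total
```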

4 Measurement of Cosmetics

To compute the reflectance and transmittance of the cosmetic layers, we need the scattering coefficient S and the absorption coefficient K. We measured these properties of cosmetics according to the Kubelka-Munk model. S and K can be evaluated by Equations 6 and 7, respectively; both are wavelength dependent.

S = \frac{1}{bD} \left[ \coth^{-1}\!\left(\frac{a - R}{b}\right) - \coth^{-1}\!\left(\frac{a - R_g}{b}\right) \right]    (6)

K = S (a - 1),    (7)

where a = \frac{1}{2}\left(\frac{1}{R_\infty} + R_\infty\right), b = \sqrt{a^2 - 1}, and D is the thickness of the sample. R_\infty denotes the reflectance of the layer when it is thick enough to be opaque, R represents the reflectance of a thin layer painted on a background, and R_g is the reflectance of the background. Note that the reflectance R differs from R_0 in Equation 1: R in Equation 6 contains the influence of the background beneath the cosmetic layer, while R_0 in Equation 1 is the reflectance of the layer alone.

Solving for both the scattering coefficient S and the absorption coefficient K requires the corresponding R_\infty, R, and R_g. We use a spectroradiometer (MINOLTA CS-2000) to measure liquid foundation, cream blush, cream eye-shadow, and lipstick. A D55 light source is used, and measurements are taken for wavelengths between 380 nm and 780 nm at 5-nm intervals. Figure 4 shows our measurement device and environment.
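A sketch of this inversion, per wavelength, follows (Python/NumPy for illustration; arccoth is expressed through a logarithm since NumPy has no built-in arccoth).

```python
import numpy as np

def arccoth(x):
    # coth^{-1}(x) = 0.5 * ln((x + 1) / (x - 1)), valid for |x| > 1.
    return 0.5 * np.log((x + 1.0) / (x - 1.0))

def invert_km(R_inf, R, R_g, D):
    """Recover S and K per wavelength (Equations 6 and 7) from the
    measured reflectance spectra of an opaque layer (R_inf), a thin
    layer of thickness D over a background (R), and the background
    itself (R_g)."""
    a = 0.5 * (1.0 / R_inf + R_inf)
    b = np.sqrt(a * a - 1.0)
    S = (arccoth((a - R) / b) - arccoth((a - R_g) / b)) / (b * D)
    K = S * (a - 1.0)
    return S, K
```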

The spectroradiometer measures the radiant energy coming from the light source and reflected by the material. As the received radiant energy is influenced by the spectrum of the light source, we need to convert the radiant energy to reflectance:

R(\lambda) = \frac{E_m(\lambda)}{E_L(\lambda)},    (8)

where E_m(\lambda) is the radiant energy of the light that hits the surface and is reflected to the spectroradiometer, R(\lambda) is the reflectance of the material, and E_L(\lambda) is the radiant energy of the incident light. We use barium sulfate of 99% purity as the standard white material to measure the radiant energy of the light source E_L. The reflectance R(\lambda) of each measured material can then be computed by Equation 8.

Figure 4: Our capture device and environment.

Figure 5: Liquid foundation samples. Left: the background. Middle: a thick layer. Right: a thin layer.

In our measurement experiments, we paint a cosmetic sample on the black background with enough thickness to make it opaque and measure its reflectance to obtain R∞. Then, we paint the sample on a black background with a known thickness to measure R. Finally, we measure the reflectance of the background to determine Rg. Figure 5 shows a liquid foundation sample. We also made a simple container to hold the samples, as shown in Figure 6. With this container, we can easily make a flat sample and measure its thickness. Figure 7 shows the reflectances R∞, R, and Rg of three different cosmetics.
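As a usage sketch, the three measurements feed Equation 8 and then the inversion above. The file names are hypothetical; each file is assumed to hold one spectrum sampled at 5-nm steps from 380 to 780 nm.

```python
import numpy as np

E_L    = np.loadtxt("white_standard.txt")       # barium sulfate -> E_L
R_inf  = np.loadtxt("sample_thick.txt") / E_L   # Equation 8
R_thin = np.loadtxt("sample_thin.txt")  / E_L
R_g    = np.loadtxt("background.txt")   / E_L

S, K = invert_km(R_inf, R_thin, R_g, D=0.15)    # 0.15-mm shallow container
```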

Figure 6: (a) The container. (b) The container loaded with liquid foundation. The depths of the shallow and deep containers are 0.15 mm and 2 mm, respectively.

We apply R∞, R, and Rg to Equations 6 and 7 to obtain the scattering coefficient S and the absorption coefficient K, respectively. The S and K at each wavelength are then used in Equations 1 and 2 to compute the reflectance R0 and transmittance T of a thin layer. We can also use different thicknesses X to describe cosmetic layers of different thicknesses.
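Continuing the sketches above, re-synthesis at an arbitrary application thickness and stacking (e.g., a blush coat over a foundation coat, top layer first) would look like the following; the S_*/K_* arrays are assumed to come from invert_km, and the thickness values are illustrative.

```python
R0_f, T_f = kubelka_munk_layer(S_foundation, K_foundation, X=0.05)  # mm
R0_b, T_b = kubelka_munk_layer(S_blush, K_blush, X=0.02)            # mm
R_stack, T_stack = combine_layers(R0_b, T_b, R0_f, T_f)  # blush on top
```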

5 Results

We implemented our cosmetic rendering system in DirectX 10 and HLSL. We rendered the makeup effects on two human facial models, Digital Coco and Digital Orange. The geometry and appearance of both models were captured using Light Stage X, including high-resolution meshes, normal maps, and diffuse reflectance maps. All rendering results were generated on a PC equipped with an Intel Core i5 750 2.67 GHz CPU, 4 GB RAM, and an NVIDIA GTX 260 graphics card. The resolution of the frame buffer is 1280 × 720. The rendering frame rate is about 170 to 175 frames per second for all cosmetic rendering results.

We measured the optical properties of different cosmetics, including five liquid foundations, two cream blushes, and seven lipsticks. The brands of these cosmetics are listed in Table A of the supplemental material. We first acquired the reflectance spectra of these cosmetics. Figure 7 shows some of the measured results. Next, we used Equations 6 and 7 to compute the absorption coefficient K and scattering coefficient S. Finally, we reconstructed the reflectance R0 and transmittance T using Equations 1 and 2. Figure 8 depicts the K, S, and reflectance R0 of the cosmetics in Figure 7. Once the optical parameters of the cosmetics are obtained, we can use our system to render human skin with cosmetics. Our system allows the user to change the cosmetic material and simulates its makeup effects in real time. The rendering result of each measured cosmetic can be found in Figures A-C of the supplemental material.

Figure 7: The spectral reflectance of some of the measured cosmetics: (a) Foundation 1; (b) Blush 1; (c) Lipstick 1. The X-axis and Y-axis are the wavelength in nm and the reflectance value, respectively. Red: reflectance of a 0.15-mm thin layer (with background) R; Green: reflectance of a thick layer R∞; Blue: reflectance of the background Rg.

Makeup usually combines layers of different cosmetics. Our method is able to render skin with multilayered cosmetics. Figures 9 and 10 show our results for multilayered cosmetics. In particular, the cheeks are influenced by both foundation and blush. Our system allows users to manipulate multiple cosmetics intuitively. Figure 11 shows a comparison of our result with a photo of real makeup. We apply the hybrid normal map method proposed by Ma et al. [16] to generate the reference images. Their method can reproduce the surface shading of the captured object since it acquires the specular and diffuse normal maps by polarized illumination in a Light Stage. As we do not have the exact cosmetic map of the real makeup, our result is slightly different from the real makeup. Nevertheless, one can still observe that our rendering result provides a very similar appearance to the reference image.

Figure 8: The obtained parameters of some of the measured cosmetics: (a) Foundation 1; (b) Blush 1; (c) Lipstick 1. Red: absorption coefficient K; Green: reflectance of thin layer (without background) R0; Blue: scattering coefficient S.

Figure 9: Makeup rendering of Digital Coco with multilayered cosmetics. (a) Cosmetic map. Red: foundation area. Blue: lipstick. Yellow: blush + foundation. (b) The appearance of bare skin. (c) Foundation 1 + Lipstick 1 + Blush 1. (d) Foundation 5 + Lipstick 1 + Blush 1. (e) Foundation 5 + Lipstick 3 + Blush 2. (f) Foundation 3 + Lipstick 3.

Figure 10: Makeup rendering of Digital Orange with multilayered cosmetics. (a) Cosmetic map. (b) The appearance of bare skin. (c) Foundation 1 + Lipstick 7 + Blush 1. (d) Foundation 3 + Lipstick 6 + Blush 1.

Figure 11: Comparison with the reference image of real makeup. Left: reference image; Right: our result.

Limitations. In the current approach, we do not consider cosmetics that have pearl or sparkle effects. As these effects are view dependent, it is difficult to simulate them using only the Kubelka-Munk model. These effects, however, are also important for makeup simulation. In addition, we do not consider cosmetics in powder form. This may not be a serious problem, though, as it is not easy to distinguish among liquid, cream, and powder when the makeup is smeared evenly. Finally, compared to the work by Donner and Jensen [9], our approach ignores the subsurface scattering effects of cosmetics. This works well for some cosmetics, but not for all, especially those with high translucency.

6 Conclusion

Our approach provides a simple way to realistically render cosmetics on human skin by combining a screen-space skin rendering method with the Kubelka-Munk model. Unlike existing makeup transfer methods that warp an example makeup image to a user's facial image, our makeup simulation system is physically based and provides flexible user control over cosmetic thickness and styles in real time. Our system can be used in the cosmetic industry as an interactive preview tool that allows customers and designers to quickly simulate makeup effects. We believe our work would greatly benefit many related applications in the cosmetic industry.

In the future, we would like to simulate more complex makeup effects that are view dependent and handle cosmetics in powder form. Also, we currently use auxiliary 2D tools to help users design the cosmetic map; in the future, we could employ 3D mesh painting tools to allow users to paint makeup directly on their facial models. Another direction is to combine the cosmetic map with cosmetic transfer methods: makeup styles could be extracted from images and represented as cosmetic maps, which could then be used in the proposed system to simulate makeup effects in different styles.

7 Acknowledgement

This work was supported in part by the Taiwan National Science Council (101-2221-E-009-156-MY2, 101-2628-E-009-021-MY3) and the UST-UCSD International Center of Excellence in Advanced Bioengineering sponsored by the Taiwan National Science Council I-RiCE Program under Grant Number NSC-101-2911-I-009-101. We would like to thank Next Media CO., LTD. for the support of the Light Stage and the captured human models, and the Taiwan Industrial Technology Research Institute for the support of the spectroradiometer and cosmetic samples.

References

[1] Jorge Jimenez, Veronica Sundstedt, and Diego Gutierrez. Screen-space perceptual rendering of human skin. ACM Transactions on Applied Perception, 6(4):23:1-23:15, 2009.

[2] Gustav Kortum and James E. Lohr. Reflectance Spectroscopy: Principles, Methods, Applications. Springer New York, 1969.

[3] Wai-Shun Tong, Chi-Keung Tang, Michael S. Brown, and Ying-Qing Xu. Example-based cosmetic transfer. In 15th Pacific Conference on Computer Graphics and Applications (PG '07), pages 211-218. IEEE, 2007.

[4] Dong Guo and Terence Sim. Digital face makeup by example. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2009), pages 73-79. IEEE, 2009.

[5] Kristina Scherbaum, Tobias Ritschel, Matthias Hullin, Thorsten Thormahlen, Volker Blanz, and Hans-Peter Seidel. Computer-suggested facial makeup. In Computer Graphics Forum, volume 30, pages 485-492. Wiley Online Library, 2011.

[6] Yusuke Moriuchi, Shoji Tominaga, and Takahiko Horiuchi. Precise analysis of spectral reflectance properties of cosmetic foundation. In Image Analysis, pages 138-148. Springer, 2009.

[7] M. Doi, R. Ohtsuki, and S. Tominaga. Spectral estimation of skin color with foundation makeup. In Image Analysis, pages 95-104. Springer, 2005.

[8] Henrik Wann Jensen, Stephen R. Marschner, Marc Levoy, and Pat Hanrahan. A practical model for subsurface light transport. In Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques, pages 511-518. ACM, 2001.

[9] Craig Donner and Henrik Wann Jensen. Light diffusion in multi-layered translucent materials. In ACM Transactions on Graphics (TOG), volume 24, pages 1032-1039. ACM, 2005.

[10] Eugene d'Eon, David Luebke, and Eric Enderton. Efficient rendering of human skin. In Proceedings of the 18th Eurographics Conference on Rendering Techniques, pages 147-157. Eurographics Association, 2007.

[11] Aravind Krishnaswamy and Gladimir V. G. Baranoski. A biophysically-based spectral model of light interaction with human skin. Computer Graphics Forum, 23(3):331-340, 2004.

[12] Craig Donner, Tim Weyrich, Eugene d'Eon, Ravi Ramamoorthi, and Szymon Rusinkiewicz. A layered, heterogeneous reflectance model for acquiring and rendering human skin. In ACM Transactions on Graphics (TOG), volume 27, page 140. ACM, 2008.

[13] Abhijeet Ghosh, Tim Hawkins, Pieter Peers, Sune Frederiksen, and Paul Debevec. Practical modeling and acquisition of layered facial reflectance. In ACM Transactions on Graphics (TOG), volume 27, page 139. ACM, 2008.

[14] Julie Dorsey and Pat Hanrahan. Modeling and rendering of metallic patinas. In Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques, pages 387-396. ACM, 1996.

[15] Csaba Kelemen and Laszlo Szirmay-Kalos. A microfacet based coupled specular-matte BRDF model with importance sampling. In Eurographics Short Presentations, volume 25, page 34, 2001.

[16] Wan-Chun Ma, Tim Hawkins, Pieter Peers, Charles-Felix Chabert, Malte Weiss, and Paul Debevec. Rapid acquisition of specular and diffuse normal maps from polarized spherical gradient illumination. In Proceedings of the 18th Eurographics Conference on Rendering Techniques, pages 183-194. Eurographics Association, 2007.