Physically based cosmetic rendering

Cheng-Guo Huang, Tsung-Shian Huang, Wen-Chieh Lin*, Jung-Hong Chuang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

8 Scopus citations

Abstract

Simulating realistic makeup effects is an important research issue in 3D facial animation and the cosmetic industry. Existing approaches based on image-processing techniques, such as warping and blending, have mostly been applied to transfer one person's makeup to another. Although these approaches are intuitive and require only makeup images, they have drawbacks such as distorted shapes and fixed viewing and lighting conditions. In this paper, we propose an integrated approach that combines the Kubelka-Munk model with a screen-space skin rendering approach to simulate 3D makeup effects. The Kubelka-Munk model is used to compute the total transmittance when light passes through cosmetic layers, whereas the screen-space translucent rendering approach simulates the subsurface scattering effects inside human skin. The parameters of the Kubelka-Munk model are obtained by measuring the optical properties of different cosmetic materials, such as foundations, blushes, and lipsticks. Our results demonstrate that the proposed approach renders realistic cosmetic effects on human facial models, and that different cosmetic materials and styles can be flexibly applied and simulated in real time.
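For context, the standard single-layer Kubelka-Munk relations are sketched below; the paper's exact parameterization (e.g., per-wavelength sampling of the measured coefficients) is not given here, and the symbols K (absorption coefficient), S (scattering coefficient), d (layer thickness), and R_skin (base skin reflectance) are introduced only for illustration. For a homogeneous cosmetic layer, with

$a = 1 + \frac{K}{S}, \qquad b = \sqrt{a^2 - 1},$

the layer's reflectance and transmittance are

$R = \frac{\sinh(bSd)}{a\,\sinh(bSd) + b\,\cosh(bSd)}, \qquad T = \frac{b}{a\,\sinh(bSd) + b\,\cosh(bSd)}.$

A layer placed over skin (or over another cosmetic layer) with reflectance $R_{\text{skin}}$ can then be composited by accounting for inter-reflection between the two:

$R_{\text{total}} = R + \frac{T^2 R_{\text{skin}}}{1 - R\,R_{\text{skin}}}.$

In a pipeline like the one described in the abstract, such composited layer transmittance/reflectance values would modulate the light entering and leaving the skin before the screen-space subsurface-scattering pass is applied.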

Original language: English
Pages (from-to): 275-283
Number of pages: 9
Journal: Computer Animation and Virtual Worlds
Volume: 24
Issue number: 3-4
DOIs
State: Published - 1 May 2013

Keywords

  • cosmetic rendering
  • skin rendering
  • translucent rendering
