Real-Time Illumination Estimation from Faces for Coherent Rendering
SCHEDULE INFORMATION
Event Title: Reconstruction and Fusion
Session Title: Reconstruction and Fusion
Chair: Walterio Mayol-Cuevas, Bristol University
Room: HS1
Start: 11 Sep, 2014 10:00 AM
End: 11 Sep, 2014 12:45 PM
Authors:
Sebastian B. Knorr
Daniel Kurz
Abstract:
We present a method for estimating the real-world lighting conditions within
a scene in real time. The estimation is based on the visual appearance of a
human face in the real scene, captured in a single image from a monocular
camera. In hardware setups featuring a user-facing camera, an image of the
user's face can be acquired at any time. The limited variation between
different human faces makes it possible to analyze their appearance offline
and to apply the results to new faces. Our approach uses radiance transfer
functions, learned offline from a dataset of face images under different
known illuminations, for particular points on the human face. Based on these
functions, we recover the most plausible real-world lighting conditions for
the reflections measured in a face, represented as a function of the incident
light direction using Spherical Harmonics. The pose of the
camera relative to the face is determined by means of optical tracking, and
virtual 3D content is rendered and overlaid onto the real scene with a fixed
spatial relationship to the face. By applying the estimated lighting
conditions to the rendering of the virtual content, the augmented scene is
shaded coherently across its real and virtual parts. We show, with examples
under a variety of lighting conditions, that our approach provides plausible
results that considerably enhance the visual realism of real-time Augmented
Reality applications.
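
To make the estimation step concrete, the following Python sketch (our
illustration, not the authors' code) frames lighting recovery as a linear
least-squares problem: each sampled face point contributes one measured
intensity and one learned radiance transfer vector (its response to each
Spherical Harmonics basis function of the lighting), and the lighting
coefficients are the solution of the resulting system. The function name,
the 9-coefficient (band-2) representation, and the synthetic data are
assumptions made for illustration.

import numpy as np

NUM_SH_COEFFS = 9  # bands 0..2 of real Spherical Harmonics (assumed order)

def estimate_lighting(transfer, intensities):
    # transfer    : (n_points, NUM_SH_COEFFS) radiance transfer matrix learned
    #               offline; row i is the response of face sample point i to
    #               each SH basis function of the incident lighting.
    # intensities : (n_points,) intensities measured at those points in the
    #               current camera image.
    # Solve transfer @ l = intensities for l in the least-squares sense.
    coeffs, _, _, _ = np.linalg.lstsq(transfer, intensities, rcond=None)
    return coeffs

# Hypothetical usage with synthetic data: 50 tracked face sample points.
rng = np.random.default_rng(0)
transfer = rng.uniform(0.0, 1.0, size=(50, NUM_SH_COEFFS))
true_light = np.array([1.0, 0.2, 0.4, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0])
intensities = transfer @ true_light
print(estimate_lighting(transfer, intensities))  # recovers true_light

With more sample points than coefficients, the system is overdetermined, so
the fit also suppresses per-point measurement noise.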
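
Once the lighting coefficients are known, the virtual content can be shaded
under the same light. One plausible reading of "applying the estimated
lighting conditions to the rendering", and a standard technique for diffuse
surfaces, is to convert the band-2 Spherical Harmonics lighting into
irradiance per surface normal using the band scaling factors of Ramamoorthi
and Hanrahan (2001). The sketch below assumes that technique, real-valued SH
basis functions, and a unit-length normal; it is not taken from the paper.

import numpy as np

# Band scaling factors A_l for Lambertian irradiance (pi, 2*pi/3, pi/4),
# repeated for each coefficient within its band.
A = np.array([np.pi] + [2.0 * np.pi / 3.0] * 3 + [np.pi / 4.0] * 5)

def sh_basis(n):
    # Real SH basis functions for bands 0..2, evaluated at unit normal n.
    x, y, z = n
    return np.array([
        0.282095,                        # Y_0,0
        0.488603 * y,                    # Y_1,-1
        0.488603 * z,                    # Y_1,0
        0.488603 * x,                    # Y_1,1
        1.092548 * x * y,                # Y_2,-2
        1.092548 * y * z,                # Y_2,-1
        0.315392 * (3.0 * z * z - 1.0),  # Y_2,0
        1.092548 * x * z,                # Y_2,1
        0.546274 * (x * x - y * y),      # Y_2,2
    ])

def diffuse_irradiance(lighting, n):
    # Irradiance at a surface point with normal n under the estimated
    # SH lighting coefficients.
    return float(np.dot(A * lighting, sh_basis(n)))

A Lambertian surface would then be shaded as
albedo * diffuse_irradiance(lighting, n) / pi, so the virtual content
receives the same low-frequency light as the observed face.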