Thesis Defence: Biophysically Based Texturing: A Spatially Aware Autoencoder Approach to Interactive Human Skin Rendering
August 1 at 1:00 pm - 5:00 pm
Joel Johnson, supervised by Dr. Kenneth Chau, will defend their thesis titled “Biophysically Based Texturing: A Spatially Aware Autoencoder Approach to Interactive Human Skin Rendering” in partial fulfillment of the requirements for the degree of Master of Applied Science in Electrical Engineering.
An abstract for Joel Johnson’s thesis is included below.
Defences are open to all members of the campus community as well as the general public. Please email kenneth.chau@ubc.ca to receive the Zoom link for this defence.
ABSTRACT
The goal of this work is to produce a biologically and physically accurate model of the relationship between skin structure and color that is practical for real-time rendering and editing of skin appearance. As virtual and augmented reality become more prevalent, the demand for realistic skin rendering grows, requiring a model that not only captures the subtle nuances of skin tones and textures but also allows dynamic modification in real time. To establish an accurate baseline, Monte Carlo light transport simulations are used to generate a detailed tabular dataset encompassing skin chromophore amounts, layer thicknesses, reflectance distributions, and corresponding RGB colors. This dataset serves as a lookup table (LUT) for efficient conversion of 2D pixel arrays or power spectral densities into their corresponding biophysical parameter maps through optimized search methods and GPU-based matrix operations.

To map the data more efficiently, autoencoders are used to learn a nuanced mapping between skin parameters and color. We then evaluate the efficacy of both RGB and HSI autoencoders in parameter map generation, image recovery, and image editing, aiming for a more accurate and subtle representation of skin color. To further extend the model's capabilities, vision transformers are used to translate RGB textures into hyperspectral cubes, employing spatial and spectral multi-headed self-attention mechanisms.

In its final stage, the model incorporates dense landmarks to facilitate spatially coherent alterations within the latent space, ensuring realistic and accurate transfer of latent biophysical properties between different images of human skin. This novel approach enables photo-realistic, real-time skin rendering and a biologically based spatial distribution of skin characteristics.
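To give a sense of the LUT-based mapping the abstract describes, the sketch below illustrates the general idea of converting an RGB pixel array into a biophysical parameter map by a nearest-color search written as a single broadcasted matrix operation. All names, sizes, and the random data here are hypothetical placeholders, not the thesis's actual dataset or implementation; a real pipeline would use the simulated LUT and optimized GPU search methods described above.

```python
import numpy as np

# Hypothetical illustration only: each LUT row pairs biophysical parameters
# (e.g. melanin fraction, hemoglobin fraction, layer thickness) with a
# precomputed RGB color from light-transport simulation. Random data stands
# in for the real simulated table.
rng = np.random.default_rng(0)
lut_params = rng.random((1000, 3))   # 1000 LUT entries x 3 parameters (made up)
lut_rgb = rng.random((1000, 3))      # corresponding simulated RGB colors

pixels = rng.random((64 * 64, 3))    # a flattened 64x64 RGB texture

# Squared Euclidean distance from every pixel to every LUT color, computed
# as one broadcasted operation (the kind of batched matrix work that maps
# naturally onto a GPU).
d2 = ((pixels[:, None, :] - lut_rgb[None, :, :]) ** 2).sum(axis=-1)
nearest = d2.argmin(axis=1)          # index of the closest LUT color per pixel

# Gather the matched parameters back into an image-shaped parameter map.
param_map = lut_params[nearest].reshape(64, 64, 3)
print(param_map.shape)  # (64, 64, 3)
```

An exhaustive distance computation like this is only a conceptual baseline; the abstract's "optimized search methods" and the autoencoder mapping exist precisely to avoid scanning the full table per pixel.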
Through this comprehensive methodology, the work sets a new standard for skin rendering, editing, and generation, combining biological accuracy with practical applicability for enhanced realism and inclusivity in digital humans.