[pdf] [supp] [arXiv] [bibtex]

@InProceedings{Giebenhain_2023_CVPR,
  author    = {Giebenhain, Simon and Kirschstein, Tobias and Georgopoulos, Markos and R\"unz, Martin and Agapito, Lourdes and Nie{\ss}ner, Matthias},
  title     = {Learning Neural Parametric Head Models},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2023},
  pages     = {21003-21012}
}
Learning Neural Parametric Head Models
Abstract
We propose a novel 3D morphable model for complete human heads based on hybrid neural fields. At the core of our model lies a neural parametric representation that disentangles identity and expressions in disjoint latent spaces. To this end, we capture a person's identity in a canonical space as a signed distance field (SDF), and model facial expressions with a neural deformation field. In addition, our representation achieves high-fidelity local detail by introducing an ensemble of local fields centered around facial anchor points. To facilitate generalization, we train our model on a newly captured dataset of over 3700 head scans from 203 different identities, acquired with a custom high-end 3D scanning setup. Our dataset significantly exceeds comparable existing datasets with respect to both quality and completeness of geometry, averaging around 3.5M mesh faces per scan. Finally, we demonstrate that our approach outperforms state-of-the-art methods in terms of fitting error and reconstruction quality.
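The abstract's decomposition into a canonical identity SDF plus an expression-conditioned deformation field can be illustrated with a minimal sketch. This is not the paper's implementation: the network sizes, latent dimensions, and random (untrained) weights below are placeholders, and the toy MLPs merely stand in for the learned fields. The structure shown is only the high-level idea: a query point is offset by the deformation field, then the identity SDF is evaluated at the resulting canonical-space location.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(dims):
    """Build random (weight, bias) pairs for a toy stand-in MLP."""
    return [(rng.normal(0, 0.1, (i, o)), np.zeros(o))
            for i, o in zip(dims[:-1], dims[1:])]

def forward(params, x):
    """Evaluate the toy MLP with ReLU activations on hidden layers."""
    for k, (W, b) in enumerate(params):
        x = x @ W + b
        if k < len(params) - 1:
            x = np.maximum(x, 0.0)
    return x

D_ID, D_EX = 8, 4                        # illustrative latent sizes
sdf_net = mlp([3 + D_ID, 32, 1])         # identity SDF in canonical space
def_net = mlp([3 + D_ID + D_EX, 32, 3])  # expression deformation field

def head_sdf(x, z_id, z_ex):
    """SDF of a posed head: deform the query point toward canonical
    space, then evaluate the identity SDF there."""
    delta = forward(def_net, np.concatenate([x, z_id, z_ex]))
    x_canonical = x + delta
    return forward(sdf_net, np.concatenate([x_canonical, z_id]))[0]

z_id = rng.normal(size=D_ID)   # identity code
z_ex = rng.normal(size=D_EX)   # expression code
value = head_sdf(np.array([0.1, 0.0, 0.2]), z_id, z_ex)
```

Changing only `z_ex` while keeping `z_id` fixed would vary the deformation but not the underlying identity surface, which is the disentanglement the latent-space split is meant to provide. The paper's local-field ensemble around facial anchor points is omitted here for brevity.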
Related Material