Humboldt-Universität zu Berlin - Mathematisch-Naturwissenschaftliche Fakultät - Visual Computing

BMBF Project 3DGIM

3D facial analysis for identification and human-computer interaction

 

Period: 01 October 2015 - 30 September 2017

 

This project is funded by the BMBF under grant 03ZZ0407.

The 3Dsensation consortium aims at fundamentally redefining human-machine interaction. This also includes the automatic 3D analysis of human faces and facial expressions, which is useful in many tasks such as access control, the creation of new dialog systems for human-computer interaction, or even medical therapy.

Current methods for geometric facial expression analysis and synthesis often rely purely on linear models. They represent facial expressions as a linear combination of a small number of basis expressions, which are typically learned from a large training set using well-known machine learning techniques such as PCA. While this approach is very popular due to its simplicity, it lacks fine details and the ability to appropriately cover the space of possible facial expressions. Several techniques have been proposed to circumvent these limitations, for example part-based models, adaptation of basis expressions, or extension of the expression space via corrective shapes.

In contrast to the aforementioned methods, we want to develop a new model-based approach that is able to refine the estimated result hierarchically, i.e. a state-based facial expression model that adapts its expressiveness/complexity to the current facial expression state. This allows for more detailed deformations without unnecessarily increasing the overall complexity. Finally, we combine the model-based reconstruction technique with a model-free refinement step to ensure highly detailed 3D reconstructions while simultaneously guaranteeing temporal as well as semantic consistency.
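To illustrate the linear baseline described above (not the project's own code), the following minimal sketch builds a PCA-style expression model: face meshes are flattened into vectors, a mean face and a small set of basis expressions are learned from training data, and a new face is approximated as the mean plus a linear combination of the basis vectors. All array names, shapes, and the synthetic data are illustrative assumptions.

```python
# Minimal sketch of a linear (PCA-based) facial expression model.
# A mesh with n vertices is flattened into a 3n-dimensional vector.
import numpy as np

def learn_linear_model(training_faces: np.ndarray, k: int):
    """training_faces: (num_samples, 3*n) matrix of flattened face meshes."""
    mean_face = training_faces.mean(axis=0)
    centered = training_faces - mean_face
    # SVD of the centered data yields the principal expression directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:k]                          # (k, 3*n) orthonormal basis expressions
    return mean_face, basis

def reconstruct(face: np.ndarray, mean_face: np.ndarray, basis: np.ndarray):
    """Project a face onto the linear expression space and reconstruct it."""
    coeffs = basis @ (face - mean_face)     # projection coefficients (basis is orthonormal)
    return mean_face + coeffs @ basis, coeffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    faces = rng.normal(size=(100, 3 * 500))      # 100 synthetic scans of a 500-vertex mesh
    mean_face, basis = learn_linear_model(faces, k=10)
    approx, coeffs = reconstruct(faces[0], mean_face, basis)
    print("reconstruction error:", np.linalg.norm(approx - faces[0]))
```

The limitation motivating the project is visible here: with a small, fixed number of basis vectors k, fine expression details outside the learned linear subspace cannot be represented, which is what the hierarchical, state-based model and the model-free refinement step are intended to address.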
