3D Modeling of Face and Facial Expressions

Advances in computer vision and online face-to-face communication promise to rapidly reshape mental health research, allowing for exquisitely rich characterization of behavior from videos collected in individuals’ everyday environments. Unfortunately, the great promise of computer vision techniques is hampered by technical artifacts introduced by uncontrolled environments and variation in data collection devices (e.g., personal phones).

We develop methods that are robust to technical factors such as illumination, pose, and motion, so that social behavior can be captured precisely across uncontrolled conditions. Our methods decompose a facial image into multiple factors, including personal identity, illumination, facial expression, and pose. This allows us to measure facial expressions in a pose-independent manner, as well as to quantify and simulate illumination conditions with high accuracy.
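As a rough illustration of this decomposition, the sketch below assembles a face from separate identity, expression, and pose parameters in the spirit of a 3D morphable model (3DMM). All function names, array shapes, and bases here are illustrative assumptions, not our actual implementation.

```python
import numpy as np

# Hypothetical sketch of a 3DMM-style decomposition: a mean shape is deformed
# by identity and expression coefficients, then rotated and translated by pose.
# Illumination would be modeled separately on top of the resulting geometry.

def reconstruct_face(mean_shape, id_basis, expr_basis,
                     alpha_id, alpha_expr, R, t):
    """Return 3D vertices for given identity/expression/pose parameters.

    mean_shape : (N, 3) mean face vertices
    id_basis   : (N, 3, K_id) identity deformation basis
    expr_basis : (N, 3, K_expr) expression deformation basis
    alpha_id   : (K_id,) identity coefficients
    alpha_expr : (K_expr,) expression coefficients
    R, t       : (3, 3) rotation and (3,) translation encoding head pose
    """
    shape = (mean_shape
             + id_basis @ alpha_id        # person-specific geometry
             + expr_basis @ alpha_expr)   # expression-specific deformation
    return shape @ R.T + t                # head pose applied last
```

Because expression and pose enter such a model through separate parameters, expression coefficients recovered by fitting it are pose-independent by construction.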


Qualitative illustration of our method’s performance on the AFLW2000-3D dataset. Top row: input images; middle row: 2D landmarks estimated by our method; bottom row: dense 3D shape estimated by our method. Notably, our method operates successfully in these uncontrolled conditions even though we use a 3DMM built from data collected under controlled conditions, namely the Basel Face Model 2017.
Cumulative error distribution (CED) of compared methods on the Synthesized dataset for the tasks of 2D and 3D landmark estimation. Performance is reported separately for L=68 and L=51 landmarks.
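For reference, a CED curve of the kind shown above can be computed as in the following sketch; the normalization by face size and the threshold range are illustrative assumptions, not the exact evaluation protocol of the papers below.

```python
import numpy as np

# Sketch of a cumulative error distribution (CED) for landmark estimation:
# for each error threshold, report the fraction of images whose mean
# (size-normalized) landmark error falls below that threshold.

def ced_curve(pred, gt, face_sizes, thresholds):
    """pred, gt: (M, L, 2) landmarks for M images; face_sizes: (M,)."""
    # Mean per-image landmark error, normalized by face size
    errors = np.linalg.norm(pred - gt, axis=-1).mean(axis=1) / face_sizes
    # Fraction of images with error at or below each threshold
    return np.array([(errors <= t).mean() for t in thresholds])

thresholds = np.linspace(0.0, 0.1, 100)
# curve = ced_curve(pred_landmarks, gt_landmarks, sizes, thresholds)
```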

Source Codes
3D face reconstruction
Assessing separability of facial pose and expression with a weak perspective camera model (see the sketch below)
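For context on the second item, the sketch below shows the weak perspective camera model in question: 3D points are rotated, orthographically projected, and then uniformly scaled and translated in the image plane. Variable names are illustrative, not taken from the released code.

```python
import numpy as np

def weak_perspective_project(X, R, s, t2d):
    """Project (N, 3) points X with rotation R, scale s, 2D translation t2d."""
    return s * (X @ R.T)[:, :2] + t2d  # rotate, drop depth, scale, translate
```

The CVPR 2020 paper listed below asks whether, under this projection, the pose parameters (R, s) can be uniquely disentangled from expression-driven deformations of the 3D shape X.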

Publications
Sariyanidi E., Zampella C. J., Schultz R. T., Tunç B., Inequality-Constrained and Robust 3D Face Model Fitting, European Conference on Computer Vision, Accepted for publication, 2020
Sariyanidi E., Zampella C. J., Schultz R. T., Tunç B., Can Facial Pose and Expression Be Separated with Weak Perspective Camera?, IEEE Conference on Computer Vision and Pattern Recognition, Accepted for publication, 2020
