When people talk to each other, they convey information not only through the many facets of language and voice but also through their facial expressions. Facial expressions are an essential part of social interaction, yet communication in virtual reality (VR) has so far lacked them entirely. In this project, we therefore explore the possibilities of face tracking technology.


Using RGB-D sensors, we can scan a face and transfer its movements to an avatar; these sensors capture the depth of the facial surface in addition to color. We also use RGB cameras to track eye movements and transfer them to the avatar. To achieve an even higher level of detail in the virtual representation, the VR headset additionally includes sensors that detect eyebrow movement.
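As a rough illustration of how tracked facial measurements can drive an avatar, the toy sketch below maps the 3-D distance between two lip landmarks (as an RGB-D sensor might report them, in metres) to a blendshape weight in [0, 1]. The landmark positions, gap thresholds, and the "mouth open" blendshape are illustrative assumptions, not the project's actual pipeline.

```python
import math

def mouth_open_weight(upper_lip, lower_lip, neutral_gap=0.01, max_gap=0.05):
    """Map the 3-D distance between an upper- and lower-lip landmark
    (hypothetical values in metres) to a 'mouth open' blendshape weight.

    neutral_gap: lip distance with the mouth closed (assumed 1 cm)
    max_gap:     lip distance with the mouth fully open (assumed 5 cm)
    """
    gap = math.dist(upper_lip, lower_lip)
    # Normalize between the neutral (closed) and fully open gap,
    # then clamp to the valid blendshape range [0, 1].
    weight = (gap - neutral_gap) / (max_gap - neutral_gap)
    return min(1.0, max(0.0, weight))

# Example: lips 3 cm apart, i.e. halfway between closed and fully open.
upper = (0.0, 0.015, 0.0)
lower = (0.0, -0.015, 0.0)
print(mouth_open_weight(upper, lower))  # ~0.5
```

In a real pipeline, a weight like this would be computed per frame for many expressions (mouth, eyes, eyebrows) and fed to the avatar's animation system.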


Contact: Philipp Ladwig


Acknowledgements:

This project is sponsored by the German Federal Ministry of Education and Research (BMBF) under the project numbers 16SV8182 and 13FH022IX6. Project names: HIVE-Lab and Interactive Body-Near Production Technology 4.0 (German: Interaktive körpernahe Produktionstechnik 4.0, iKPT4.0).