Unreal Engine MetaHuman with Faceware Mark IV HMC, Xsens, Manus & Glassbox Technologies

If you have worked with MetaHumans and facial motion capture, you know how challenging it is to capture dialogue. Here is a short clip of my MetaHuman speaking Greek. This is one of the first tests I did in Unreal Engine using the Faceware Mark IV HMC to record facial motion, with Xsens and MANUS™ capturing body and finger data simultaneously. I used the Glassbox Technologies Live Client plugin to stream the facial motion from Faceware Studio into Unreal. All of this is powered by a custom Puget Systems virtual production workstation equipped with an NVIDIA RTX A6000. I am grateful to be able to use all of this incredible technology to create in Unreal Engine. I will be releasing a series of tutorials very soon, walking you through my entire process, from capturing facial motion with MetaHumans to creating cinematics with the virtual camera tool Dragonfly.

MetaHumans from Epic Games and 3Lateral. Thank you to Daniel Rodriguez Cadena for your incredible texture work on the fe
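Until those tutorials are out, here is a rough sketch of what "streaming facial motion into a client" looks like at the data level. This is a hypothetical, minimal example only: the port, packet layout, and field names are invented for illustration and are not the actual Faceware Studio / Glassbox Live Client protocol, which does its streaming through the plugin inside Unreal.

    import json
    import socket

    # Hypothetical illustration only: NOT the Faceware Studio / Glassbox Live Client
    # protocol. It sketches the general idea of a client that listens for streamed
    # facial-animation values (e.g. blendshape weights) arriving as JSON packets
    # over UDP, one packet per animation frame.

    LISTEN_ADDR = ("0.0.0.0", 9000)  # placeholder address/port chosen for this example

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(LISTEN_ADDR)
    print(f"Listening for facial-animation packets on {LISTEN_ADDR} ...")

    while True:
        data, sender = sock.recvfrom(65535)       # one datagram per frame
        frame = json.loads(data.decode("utf-8"))  # e.g. {"timecode": "...", "blendshapes": {...}}
        weights = frame.get("blendshapes", {})
        # A real consumer would map these weights onto the character's facial rig;
        # here we just print them to show the data arriving frame by frame.
        print(frame.get("timecode"), {name: round(w, 3) for name, w in weights.items()})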