Posted by jfyelle, 12 June 2011 5:00 pm
Martin Breidt and his colleagues at the Max Planck Institute for Biological Cybernetics are researching novel ways to retarget facial animation onto digital characters.
One of the approaches they have developed is shown in this video. It uses morph targets initially set up in 3ds Max to interpret the observed data. Using a Kinect device as the data source gives remarkably good results, considering that only the depth data was used, not the color video. The video demonstrates the robustness of the system and its ability to handle noisy data.
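The source doesn't detail the math behind the morph-target setup, but the core idea of morph targets (blendshapes) is a weighted combination of per-target vertex offsets added to a neutral mesh. Here is a minimal NumPy sketch of that idea; the function name and array layout are illustrative assumptions, not Breidt's actual implementation:

```python
import numpy as np

def blend_shapes(neutral, targets, weights):
    """Weighted morph-target (blendshape) combination.

    neutral: (V, 3) array of neutral-pose vertex positions
    targets: (K, V, 3) array of K morph-target vertex positions
    weights: (K,) blend weights, typically in [0, 1]
    """
    offsets = targets - neutral          # per-target displacement from neutral
    # Sum of weight_k * offset_k, added back onto the neutral mesh
    return neutral + np.tensordot(weights, offsets, axes=1)

# Tiny demo: one target displacing every vertex by (1, 1, 1), at half weight
neutral = np.zeros((2, 3))
targets = np.ones((1, 2, 3))
print(blend_shapes(neutral, targets, np.array([0.5])))
```

Fitting weights like these to each frame of Kinect depth data is, in essence, what a retargeting system has to solve.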
Martin Breidt’s work involved processing lots of 3D scans of faces, writing a lot of MAXScript code and implementing a live link between 3ds Max and Matlab, a technical computing/math/programming environment developed by MathWorks.
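The post doesn't say how the 3ds Max/Matlab live link was implemented. One common way to build such a link is a simple TCP connection that streams one frame of blend weights at a time; the sketch below shows that pattern as a Python loopback demo (both function names and the comma-separated wire format are hypothetical, chosen only for illustration):

```python
import socket
import threading

def recv_frame(srv):
    """Accept one connection and read one newline-terminated
    frame of comma-separated blend weights (the '3ds Max side')."""
    conn, _ = srv.accept()
    line = conn.makefile().readline()
    conn.close()
    return [float(x) for x in line.strip().split(",")]

def send_frame(port, weights):
    """Push one frame of blend weights (the 'Matlab side')."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall((",".join(f"{w:.4f}" for w in weights) + "\n").encode())

# Loopback demo: one side listens, the other pushes a frame.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))        # let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

result = []
t = threading.Thread(target=lambda: result.extend(recv_frame(srv)))
t.start()
send_frame(port, [0.25, 0.5, 0.0])
t.join()
srv.close()
print(result)  # [0.25, 0.5, 0.0]
```

In practice the Max end of such a link would apply each received frame to the Morpher modifier via MAXScript, but the transport idea is the same.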
I hear they have had an encouraging and positive response and are looking into how this could be turned into a product. Give them a shout!
Check http://www.kinecthacks.com/kinect-detects-3d-facial-expressions/ for more information.
About Martin Breidt
Martin is a long-time user and beta tester of 3ds Max, so it was only natural for him to use 3ds Max as the development platform for this system. He is currently a technical director at the Max Planck Institute for Biological Cybernetics, specializing in 3D computer graphics, animation, realtime rendering, 3D scanning, motion capture and video capture. Check out his personal website.