RealSense Faceshift

As a mo-cap animator, I was invited this week to an Intel® RealSense™ Hands-on Lab in Berlin. I had the opportunity to discover this new camera and the astonishing features it offers. Most of all, I was interested in its face-tracking capabilities: eyes, cheeks, shape of the lips, expressions, emotions, and so on. Indeed, paired with motion-capture software like Faceshift, you can get full facial animation and lip-sync without any markers, as opposed to an optical system.

Faceshift is a very powerful piece of software, but it relies entirely on the camera you are using. I tested it some time ago with the first version of the Kinect sensor, and the results were pretty good. We were given a few hours to develop a prototype application using the RealSense camera features, so let's see what this one has to offer!

No app for me, since I'm not from the dark side of programming :) Instead, I used the time to install a demo of Faceshift and set up the camera for it. A quick scan of my head transformed a generic head into a 3D model looking disturbingly like me! Once finished, I refined it by scanning my face while performing various expressions (blendshapes): mouth open, frowning, puffed cheeks. I let you imagine the startled programmers looking at me all around! I played with the camera parameters to get the best result possible, then retargeted the face tracking onto a 3D head, and voilà: full face tracking in real time, impressive! The high-quality tracking delivered by the camera, paired with Faceshift's algorithms, met all my expectations. It took me less than an hour to get results matching industry standards. And what a result!

I did not win the top prize that day, which was a laptop with an integrated RealSense camera.
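For the curious: the retargeting step described above essentially means driving a character's blendshapes with the expression weights the tracker streams every frame. Here is a minimal sketch of that idea in Python; this is not Faceshift's actual API, and the mesh, blendshape names, and function are hypothetical, purely to illustrate how weighted expression deltas get combined with a neutral face.

```python
import numpy as np

def apply_blendshapes(neutral, deltas, weights):
    """Blend a neutral face mesh with weighted expression deltas.

    neutral : (V, 3) array of vertex positions for the neutral face
    deltas  : dict mapping blendshape name -> (V, 3) array of vertex offsets
              (e.g. "mouth_open", "frown", "puff_cheeks")
    weights : dict mapping blendshape name -> float in [0, 1],
              typically updated per frame from the face tracker
    """
    mesh = neutral.copy()
    for name, weight in weights.items():
        mesh += weight * deltas[name]  # accumulate each weighted offset
    return mesh

# Toy example: 60% open mouth, 20% puffed cheeks on a 4-vertex "mesh".
neutral = np.zeros((4, 3))
deltas = {
    "mouth_open":  np.array([[0.0, -1.0, 0.0]] * 4),
    "puff_cheeks": np.array([[1.0,  0.0, 0.0]] * 4),
}
frame = apply_blendshapes(neutral, deltas, {"mouth_open": 0.6, "puff_cheeks": 0.2})
print(frame)
```

In a real pipeline the deltas come from the sculpted or scanned expression shapes, and the weights are whatever the tracking software outputs each frame, so the character mirrors the performer's face with no markers involved.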