Xsens, Kite & Lightning, IKINEMA, and Unreal Engine to showcase how the iPhone X and Xsens can enable democratized, full-performance motion capture at SIGGRAPH’s Real-Time Live!

Enschede, Netherlands, July 2018 – A new approach to DIY, full-performance motion capture will be showcased at this year’s Real-Time Live at SIGGRAPH 2018.

The session ‘Democratising mocap: real-time full-performance motion capture with an iPhone X, Xsens, IKINEMA, and Unreal Engine’ will take place on Tuesday 14 August from 6pm-7.45pm at the Vancouver Convention Center’s West Building, Ballroom AB.

Cory Strassburger, co-founder of LA-based cinematic VR studio Kite & Lightning, will demonstrate how an iPhone X, used in tandem with Xsens inertial motion capture technology, enables simultaneous full-body and facial performance capture, with the final animated character live-streamed, retargeted, and cleaned via IKINEMA LiveAction into Epic Games’ Unreal Engine – all in real time.

“Thanks to recent technology innovations, we now have the ability to easily generate high-quality full-performance capture data and bring our wild game characters to life – namely the iPhone X’s depth sensor and Apple’s implementation of face tracking, coupled with Xsens and the amazing quality they’ve achieved with their inertial body capture systems. Stream that live into the Unreal Engine via IKINEMA LiveAction and you’ve got yourself a very powerful and portable mocap system – one I’m very excited to show off at SIGGRAPH 2018.”
– Cory Strassburger, co-founder, Kite & Lightning

Taking a ‘Beby’ character from Kite & Lightning’s upcoming ‘Bebylon’ game, Cory will show on stage how this simple DIY setup, built from accessible technology, can drive real-time character capture and animation. He will demonstrate how the new approach to motion capture does not rely on applying markers or setting up multiple cameras for a mocap volume; instead it requires only an Xsens MVN system, a DIY mocap helmet with an iPhone X directed at the performer’s face, and IKINEMA LiveAction to stream and retarget (transfer) the motion to ‘Beby’ in Unreal Engine. With this setup, performers can act out a scene wherever they are.
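For readers curious how the iPhone X half of such a rig could feed a capture pipeline, the minimal sketch below streams ARKit face-tracking blend shapes from the phone to a capture machine over the local network. It is an illustrative assumption rather than Kite & Lightning’s actual code: the host address, port, and JSON payload layout are hypothetical, and only standard Apple APIs (ARKit and the Network framework) are used.

```swift
import ARKit
import Network

// Illustrative sketch: capture ARKit face blend shapes on an iPhone X and
// stream them as JSON over UDP to a machine running the game engine.
// The host, port, and payload format are assumptions for this example.
final class FaceStreamer: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let connection = NWConnection(
        host: "192.168.1.50",   // hypothetical capture PC on the local network
        port: 9000,             // hypothetical listening port
        using: .udp)

    func start() {
        connection.start(queue: .main)
        session.delegate = self
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // Flatten the blend-shape coefficients (0–1 floats per expression) into a dictionary.
        let coefficients = face.blendShapes.reduce(into: [String: Float]()) {
            $0[$1.key.rawValue] = $1.value.floatValue
        }
        if let payload = try? JSONSerialization.data(withJSONObject: coefficients) {
            connection.send(content: payload, completion: .contentProcessed({ _ in }))
        }
    }
}
```

In a pipeline like the one described above, the receiving machine would map such per-frame coefficients onto the character’s facial rig while the Xsens MVN body stream is retargeted through IKINEMA LiveAction into Unreal Engine.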

The Real-Time Live! session will also cover the implications and potential of this process for future creative projects, revealing how new, scalable workflows can empower the games, CG, and animation industries at the indie level without the need for huge budgets.