“Old School/New Cool: Driving Live Engagement through Mixed Mediums in Real-Time” by Lan and Dolan

  • ©Albert Lan and Michael Dolan

Description:


The hardware setup used during this process includes a DIY mocap helmet that mounts a GoPro or iPhone, and an Xsens inertial motion capture suit. The suit offers an on-body recording (OBR) mode, which gives maximum flexibility in filming settings.

For software, the team uses Reallusion's full-body mocap animation solution, Motion LIVE, as the heart of the operation. Motion LIVE links the live mocap data stream, the GoPro footage, and the facial capture data (or the iPhone X's TrueDepth camera feed, via the Live Face app) into iClone's Motion LIVE plug-in, which combines all of the feeds to drive a 3D character in a real-time live performance while blending it into the scene. Working with Reallusion's as-yet-unreleased Unreal Live Link plug-in, the team streams all of this data directly into Unreal Engine in real time, rendered with RTX ray tracing.

Once a scene or shot has been assembled in Unreal, the team uses OBS (Open Broadcaster Software) to prepare the live online broadcast to social media platforms such as Facebook Live. They gather all the media sources (the scene or characters, the live video feed, pre-recorded or processed backgrounds and foregrounds, audio, voice-over, and graphic elements such as logos, text, and UIs) and create different scenes according to the storyboard or story sequence that will take place. Once all of this is set, the Voodoo Station crew are ready to record or live-broadcast the entire process end to end in real time.
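The feed-gathering and scene-assembly step described above can be sketched as a simple data-flow model. This is an illustrative sketch only: the names (`Feed`, `Scene`, `build_broadcast`, and the source labels) are hypothetical and are not part of any Reallusion, Xsens, Unreal, or OBS API.

```python
from dataclasses import dataclass, field

@dataclass
class Feed:
    """A single live input: body mocap, face capture, video, or audio."""
    name: str
    kind: str  # e.g. "body_mocap", "face", "video"

@dataclass
class Scene:
    """An OBS-style scene: an ordered stack of media sources (bottom to top)."""
    name: str
    sources: list = field(default_factory=list)

    def add_source(self, source: str) -> None:
        self.sources.append(source)

def build_broadcast(storyboard: list) -> list:
    """Return the scene names in the order they will be cut during the show."""
    return [scene.name for scene in storyboard]

# Feeds entering the real-time character pipeline (illustrative names)
feeds = [
    Feed("xsens_suit", "body_mocap"),    # inertial suit, OBR mode
    Feed("live_face_iphone", "face"),    # Live Face app / TrueDepth feed
    Feed("gopro_helmet", "video"),       # helmet-mounted reference camera
]

# Scenes assembled in OBS per the storyboard
intro = Scene("intro")
intro.add_source("logo")
intro.add_source("title_text")

performance = Scene("performance")
performance.add_source("background_plate")   # pre-recorded background
performance.add_source("unreal_character")   # real-time Unreal render
performance.add_source("foreground_overlay")
performance.add_source("voiceover_audio")

print(build_broadcast([intro, performance]))  # -> ['intro', 'performance']
```

The point of the sketch is the layering: each OBS scene is an ordered stack of sources, and the broadcast is just a cut list over those scenes following the storyboard.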

