The annual Game Developers Conference brings over 26,000 programmers, artists, producers, and game designers together for five days in San Francisco, California. Vicon exhibited at the 2018 Expo and had one of our most successful shows ever. Take a closer look at our 'Meet Siren' and 'VEX' demonstrations and watch the livestreams below!
Siren with Epic Games, Cubic Motion, 3Lateral Studios and Tencent
Siren, a high-fidelity, full-performance character driven in real time, was developed to run on the Unreal Engine using Vicon cameras for capture and processing in Shōgun 1.2.
Epic Games handled all of the modelling, rigging, and rendering look development for Siren's body. Her face was built with 3Lateral's facial scanning and rigging pipeline, which accurately captures every dynamic muscle contraction, and the rig conveys an incredible range of emotion in real time using 3Lateral's Rig Logic technology.
Cubic Motion used the video footage from its two-camera head-mounted camera system to re-create the live actress's facial expressions, enabling Siren to replicate every movement of the performance in real time at 60 fps.
The actress's marker data is solved directly onto the Siren rig using a custom skeleton solve. This new feature in Shōgun 1.2 keeps the motion capture data true to her performance and removes the need for a separate re-targeting step. The actress wears a skin-tight Lycra suit with 59 reflective markers attached to it; the Vicon cameras track the markers, and Shōgun labels them and fits a skeleton to the marker cloud.
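To give a feel for what "labeling a marker cloud" means, here is a toy sketch of frame-to-frame marker labeling: each named marker is matched to the nearest point in the new, unlabeled cloud. This is purely illustrative; Shōgun's actual solver is far more sophisticated, handling occlusions, marker swaps, and a full skeletal model, and the function and threshold below are our own inventions.

```python
import numpy as np

def label_markers(prev_labeled, cloud, max_jump=0.05):
    """Toy frame-to-frame labeling: match each known marker name to
    the nearest unclaimed point in the new cloud, rejecting matches
    that moved farther than max_jump metres between frames."""
    labeled = {}
    used = set()
    for name, prev_pos in prev_labeled.items():
        dists = np.linalg.norm(cloud - prev_pos, axis=1)
        dists[list(used)] = np.inf  # each cloud point claimed at most once
        idx = int(np.argmin(dists))
        if dists[idx] <= max_jump:
            labeled[name] = cloud[idx]
            used.add(idx)
    return labeled
```

In a real pipeline this assignment would feed a skeleton fit that constrains where each marker can plausibly be, which is what lets the solve survive brief occlusions.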
One of the most incredible things about the Siren character is that she runs in real time at 60 fps. For the Siren demo we use our new high-speed Vantage 5-megapixel cameras, which offer a wide-lens option with very low latency, critical when running a real-time demo. Running timecode allows us to sync up the body, face and audio, which is also crucial when working with multiple hardware sources. Each part of Siren is run by its own PC: the hands and body on one, the facial solver on another. All of that data streams into the main PC, which is running Unreal Engine 4 and acting as the central hub to render the final image.
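The timecode sync described above can be sketched in miniature: each hardware source (body solve, facial solve, audio) produces timestamped samples, and at every render tick the hub picks the newest sample from each stream that is not from the future. This is an illustrative simplification, not Vicon's or Epic's implementation.

```python
import bisect

def latest_at(samples, t):
    """Return the most recent (timecode, payload) pair at or before
    time t, or None if the stream has no sample that old yet.
    `samples` must be sorted by timecode."""
    times = [tc for tc, _ in samples]
    i = bisect.bisect_right(times, t)
    return samples[i - 1] if i else None
```

Aligning every stream against a shared timecode like this is what keeps the lips, body, and audio from drifting apart when each is produced by a different machine.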
Digital characters are becoming more and more important as new forms of media emerge. Eventually we may all have digital likenesses of ourselves and having compelling characters that can emote and that we can react to is the next step in creating truly believable characters. This demo shows what is possible.
Watch the "Meet Siren" at GDC livestream here.
VEX Virtual Escape Room with DesignImage, Dynamixyz, HP and PopcornFX
To highlight the new Shōgun 1.2 and give a practical look at what it can do, we created VEX, an interactive multiplayer VR escape room made specifically for GDC to showcase the possibilities of location-based VR. The VEX escape room features puzzles and a futuristic setting created by DesignImage, with real-time facial tracking courtesy of Dynamixyz. PopcornFX added its real-time effects to bring the environment to life, and the entire experience is driven through Unreal Engine 4 using Shōgun and Live Link to animate the players' avatars in real time.
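As a rough picture of how a mocap solve reaches a game engine, the sketch below broadcasts one frame of joint transforms as JSON over UDP. This is a hypothetical stand-in, not the actual Live Link protocol or Shōgun's wire format; the packet shape (per-joint data tagged with a frame number) is the only resemblance we are claiming.

```python
import json
import socket

def send_pose(sock, addr, frame, joints):
    """Send one frame of the skeleton as a JSON datagram.
    `joints` maps joint names to their transform data, e.g.
    {"hips": [x, y, z]}. Hypothetical format for illustration only."""
    packet = {"frame": frame, "joints": joints}
    sock.sendto(json.dumps(packet).encode("utf-8"), addr)
```

A per-frame push like this, tagged with a frame or timecode, lets the engine treat the live performer as just another animation source it samples every tick.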
Players were given the opportunity to step into our Vantage camera volume and see themselves appear as an avatar in the VEX scene, guided by a performer captured in full-body and face animation.
Watch the VEX livestream here.
We want to extend a huge thank you to all of our partners for helping to make GDC 2018 such a success. Siren and VEX will be featured at the Vicon booth at FMX 2018 in Stuttgart, Germany, from April 24–27.