Real-time retargeting for lifelike character animation
One thing that the Game Developers Conference drives home every single year is the sheer (and growing) complexity of making video games.
“Every time a new technology comes out to address a challenge, we find new challenges as a result,” David ‘Ed’ Edwards, VFX Product Manager for Vicon, told an audience at a motion capture showcase. “There’s this constant cyclical pattern of working out how we make life easier for people making games. Even though motion capture is just one of a whole series of technologies involved with game development, it’s a critical one. It’s one that’s central to a lot of different pipelines. So we’re constantly asking ourselves: how do we address that issue? How do we make creating games easier, simpler and faster?”
There are many answers to that question, but in recent years many of them have come back to real-time streaming.
“We introduced this to the industry because we didn’t want people to have to wait for processing to complete to see what their data looks like,” said Ed. “The reason we introduced props was because we didn’t want people to just imagine what it would look like. All of these things are done to reduce reliance on the phrase, ‘just imagine’.
“We never really want people to say ‘just imagine what this character will look like’. Because when you say that, you introduce ambiguity, you introduce interpretation, you introduce a degradation of confidence in what they were working towards. So one thing we are constantly aiming for is removing that need to ‘just imagine’. We want to show people what they’re working on at that very moment.
“One of the ways we do that is retargeting.”

Retargeting is the process of mapping a human performer’s skeleton onto the skeleton of a character that doesn’t share their body proportions, so that the performer can drive that character in performance capture (PCAP).
“What this involves is taking each of the performer’s bones and mapping them to the bones of the character we want to drive,” said Ed. “And in some cases, where perhaps extra bones exist [in a tail, for example], we can just switch those off because we don’t want them to be influenced in any way by the performer’s motion.”
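The bone-mapping idea Ed describes can be sketched in a few lines. This is a hypothetical illustration, not Vicon’s implementation: the bone names and the `retarget_pose` function are made up, and character bones with no performer counterpart (a tail, say) are simply left out of the map, which is the “switch those off” behaviour described above.

```python
# Illustrative mapping from performer bones to character bones.
# All names here are invented for the sketch.
BONE_MAP = {
    "Hips": "pelvis",
    "Spine": "spine_01",
    "LeftUpLeg": "thigh_l",
    "RightUpLeg": "thigh_r",
    "Head": "head",
}

def retarget_pose(performer_pose, bone_map=BONE_MAP):
    """Apply each performer bone's rotation to its mapped character bone.

    performer_pose: dict of performer bone name -> rotation.
    Returns a dict of character bone name -> rotation. Character bones
    absent from the map (extra bones such as a tail) receive no input,
    so the performer's motion never influences them.
    """
    character_pose = {}
    for performer_bone, rotation in performer_pose.items():
        character_bone = bone_map.get(performer_bone)
        if character_bone is not None:
            character_pose[character_bone] = rotation
    return character_pose
```

In practice each mapped bone would also account for differing rest poses and bone lengths, but the core of a retarget is exactly this per-bone correspondence.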
The initial work is done in Shōgun Post, but once the retargeting is set up it can be imported into Shōgun Live.
“With a few magic clicks we see our retarget in real-time,” said Ed. “We’re getting highly accurate tracking. We’re getting everything in real-time. We don’t need to reprocess it. We see our character as they are designed, in the here-and-now.”
Real-time performance capture solving game animation challenges
While retargeting is a technically impressive feature, its purpose is creative.
“This is really important for directors, who can now see their idea coming to life,” said Ed. “You’re not reliant on storyboards, you can see what it looks like static and what it looks like in motion. And that’s really important because again, we are reducing your reliance on imagination. From a performer’s perspective, your actor can now see the character that they’re driving and they can map their performance to that.
“These are all important things that mean that as a team, you can creatively work together to create scenes that are indicative of the final product. Previously, you had to wait for it to be done by the render farm; you would have to wait for your FBX to be processed.”
This instant visualization means that changes can be made in the moment, not after the fact.
“If there’s something off with the proportions, or we feel that the performance needs to be modified to better suit the characters as they’ve been created, that can be done in the volume,” said Ed. “It’s not something we need to wait on feedback for, three weeks down the line. It can happen in the space. And, of course, we can also record the data, we can review it later, we can play it back.”
This real-time visualization isn’t only available as pre-visualization in Shōgun Live—it can be streamed straight into Unreal Engine 5, complete with character skins and environments for an even clearer sense of how your animation will look in your final cinematic.
“As you can imagine, you can replay this data with different models, you can bring in a different retarget,” Ed explained. “What it really does is empower creators to make iterative decisions as they go, and do it with confidence as to what the final result will be.”
For more information on how a Vicon motion capture solution can improve your creative output, click here.
Check out the highlights of Ed’s presentation below. Alternatively, you can view the full version along with others from the day here.
FAQs
What is real-time retargeting in motion capture?
Real-time retargeting in motion capture refers to the process of instantly applying captured movement from an actor to a digital character rig as the data is recorded. Instead of waiting for post-processing, real-time retargeting streams motion capture data directly into animation software or game engines, allowing teams to see finalised character motion live.
Real-time retargeting accelerates production feedback, enables creative iteration on the spot, and reduces the gap between performance capture and usable animation assets.
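The streaming workflow described here can be sketched as a simple frame loop. This is a generic, hypothetical sketch, not the Vicon or Unreal API: `frames`, `retarget`, and `send` stand in for a capture source, a retargeting step, and an engine sink, and the loop paces itself to the capture frame rate so the character moves as the data arrives rather than after post-processing.

```python
import time

def stream_retargeted(frames, retarget, send, fps=60.0):
    """Pull frames from a capture source, retarget each one, and push it
    to the engine sink, pacing the loop to roughly `fps` frames/second.

    frames:   iterable of captured skeleton frames (source is hypothetical)
    retarget: function mapping a performer frame to a character pose
    send:     function delivering the pose to the engine (sink is hypothetical)
    """
    interval = 1.0 / fps
    for frame in frames:
        start = time.monotonic()
        send(retarget(frame))  # the character is driven now, not in post
        elapsed = time.monotonic() - start
        if elapsed < interval:
            time.sleep(interval - elapsed)  # hold the frame rate
```

A real pipeline would read frames from a live socket and handle dropped frames, but the shape is the same: capture, retarget, deliver, every frame.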
How does real-time performance capture improve animation workflows?
Real-time performance capture improves animation workflows by providing immediate visual feedback of captured motion on a character model. Animators, directors, and performers can assess movement quality live, adjust performance direction instantly, and refine character animation without waiting for offline processing.
This streamlines collaboration, shortens iteration cycles, and reduces the time and cost typically spent on traditional motion capture clean-up and retargeting.
What are the benefits of real-time retargeting for game development?
Real-time retargeting offers several benefits for game development:
- Faster iteration: Teams can preview performances during capture sessions and make adjustments immediately.
- Smoother pipelines: Motion capture feeds directly into engines like Unreal and Unity with minimal latency.
- Better collaboration: Creative and technical teams can review live performances together.
- Reduced overhead: Less time spent on retargeting and manual data cleanup post-capture.
These advantages make real-time retargeting especially valuable in fast-paced game production environments where responsiveness and efficiency are critical.
How does real-time retargeting differ from traditional animation methods?
Real-time retargeting differs from traditional animation methods in speed and feedback. With traditional approaches, captured motion must be processed, cleaned, and applied to characters offline, delaying review and iteration.
In contrast, real-time retargeting applies motion capture data instantly to digital characters, enabling immediate evaluation and creative decision-making. This supports more dynamic collaboration and reduces the time between capture and final animation.
What use cases benefit most from real-time performance capture?
Use cases that benefit most from real-time performance capture include:
- Game cinematics and motion-rich gameplay where performance nuance matters
- Interactive VR/AR experiences requiring immediate character response
- Live performance capture sessions with directors and actors on set
- Rapid prototyping and iterative design where quick feedback is essential
- Multiplayer animation workflows in shared virtual environments
Real-time performance capture helps teams achieve realistic, responsive character motion while accelerating pipeline efficiency — especially in cutting-edge game development projects.