Motion Capture for Games, Film, TV, Digital Content and Location-Based VR: Highlights from SIGGRAPH 2025

If you stopped by Vicon’s booth at SIGGRAPH 2025, you might’ve caught Gastón Alvarez showcasing some incredible work made with Vicon’s mocap technology. Gastón is the Virtual Production Supervisor and Aximmetry ambassador for Spaceboy, and in his talk he walks through how one small team can ship larger-than-life work across games and film with the right mocap pipeline.

Meet the team: Spaceboy + Aximmetry + Vicon

Spaceboy is a Mexico-based creative studio (founded in 2015) that blends VFX, games, and motion graphics for film, commercials, and games. In Gastón’s session, he explained how the studio leaned into Vicon to move faster than traditional 3D animation alone – capturing natural movement first, then layering artistry on top. He also shared why he advocates for Aximmetry in live and virtual production workflows: it’s compositing software that connects cameras, body tracking, and real-time engines so you can stage and see your ideas immediately.

How mocap needs differ by medium (and how Spaceboy did it)

TV advertising: fast iteration, one performer, many characters

When Spaceboy shot a Lysol commercial, they cast a single performer (dancer/actor Samantha Aguilar) to play eight different roles, avoiding the cost of hiring eight actors. Using Vicon’s mocap system, they captured the full movement set cleanly within tight timeframes, then reused and remixed takes – saving both time and budget while keeping believable body language across characters. It’s a good template for agencies: one stage day, multiple personas, consistent quality.

Games: systemic moves, cinematic beats, and long tails

For their first video game, the narrative-driven Hannah, Spaceboy again captured all the movement for the principal player character and all enemies using Vicon, then streamed and retargeted that data into the game pipeline. “We saved a lot of time compared to traditional 3D animation,” notes Alvarez – a gain that carried into commercials, games and shorts. This pipeline translates into consistent character traversal, combat loops, and boss behaviors that “feel right” in-engine, while still leaving room for animators to add style on top. Hannah has already shipped on Xbox and Steam, with versions headed to other platforms soon. 

Film & shorts: performance first, cleanup minimized

For their latest short film, Svan, the workflow emphasized natural performances with minimal cleanup. That approach let them block scenes, like complex dance sequences, and capture them all in a single day, while keeping the spontaneity that audiences notice, even if they can’t name it. As Alvarez recalled: “…we don’t need to spend a lot of time cleaning. In fact, we made all the shots in just one day and the results were amazing. So definitely this is the best system for motion capture that we ever tested before.”

Virtual Production & live demos: put the character in the room

Spaceboy built “Maz-22,” a demonstrator character used to explain mocap to clients: a robot that can dribble, dance, or throw punches live. With Vicon providing the body motion and Aximmetry handling camera compositing, you can place the character in your real world (or a virtual set) and get instant feedback on eyelines, pacing, and interaction. As Gastón put it, “… you can make any kind of creature, or if you want to control a metahuman in real time for virtual production or augmented reality, you can use Vicon systems and obviously, the movements are incredible.”

Why this works: a practical pipeline creators can use

Gastón’s big idea is simple: start with speed, lock detail where it matters. Spaceboy explores quickly (great for previs and blocking), then dials up fidelity for hands, head, or prop interactions when shots are ready to lock. Whether you’re chasing a game milestone or a 30-second commercial, the pattern is the same:

    1. Capture quickly to protect performance and intent.
    2. Retarget live so the team sees results in context.
    3. Refine selectively (hands, head, contacts) so close-ups and interactions read perfectly.
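The three steps above can be sketched in a few lines of Python. Everything here is illustrative – the frame format, `BONE_MAP`, `retarget`, and `refine` are hypothetical stand-ins for what a real pipeline (Vicon capture feeding a game engine) would provide, not any actual Vicon or Aximmetry API:

```python
# Hypothetical capture -> retarget -> refine loop; names are illustrative only.

# Performer skeleton -> game-rig skeleton (assumed bone names, not a real rig).
BONE_MAP = {
    "Hips": "pelvis",
    "LeftHand": "hand_l",
    "RightHand": "hand_r",
}


def retarget(frame: dict) -> dict:
    """Step 2: remap captured bone data onto the target rig, live."""
    return {BONE_MAP[b]: v for b, v in frame.items() if b in BONE_MAP}


def refine(frames: list, bones: set) -> list:
    """Step 3: selectively smooth only the bones that matter for close-ups."""
    out = []
    for i, frame in enumerate(frames):
        polished = dict(frame)
        for b in bones & frame.keys():
            prev = frames[i - 1][b] if i > 0 else frame[b]
            # Simple two-tap average as a stand-in for real cleanup.
            polished[b] = tuple((p + c) / 2 for p, c in zip(prev, frame[b]))
        out.append(polished)
    return out


# Step 1: these "captured" frames stand in for a live mocap stream.
raw = [
    {"Hips": (0.0, 0.0, 0.0), "LeftHand": (10.0, 0.0, 0.0)},
    {"Hips": (0.0, 1.0, 0.0), "LeftHand": (14.0, 0.0, 0.0)},
]
live = [retarget(f) for f in raw]   # reviewed in context immediately
final = refine(live, {"hand_l"})    # hands polished only when shots lock
```

The point of the split is that steps 1–2 stay fast enough to judge performance on the day, while step 3 touches only the bones that close-ups and interactions depend on.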

This hybrid mindset pairs nicely with Vicon’s evolving markerless capabilities and proven optical tools – handing you speed and precision instead of forcing a choice. For the latest demos and workflows, check out the SIGGRAPH hub.

Using mocap for AR experiences: a quick rundown

  • Block in minutes. With Vicon driving body motion and Aximmetry doing real-time compositing, you can place a character in a physical space (or virtual set) and assess scale, silhouette, and reactions immediately.
  • Reuse everywhere. The same capture session feeds broadcast inserts, on-site activations, and social content – handy when a campaign needs multiple deliverables on tight timelines.
  • Sell the idea. Live demos help non-technical stakeholders “get it” at a glance, speeding buy-in for creative and budget. Spaceboy’s “Maz-22” is a perfect example.

Keep exploring

Vicon captures the performance, whether marker-based or markerless, while Aximmetry blends that motion with camera tracking and your real‑time engine on LED stages. You can see actors and CG together, judge eyelines and timing on the day, and carry that take straight into edit, broadcast or AR with far less rework. Capture once, review in context, publish everywhere.

Ready to see it in action? Dive into demos, guest sessions and step‑by‑step workflows on Vicon’s SIGGRAPH 2025 page.