SNOOP DOGG, LARRY DAVID, NFTS, AND AN UNLIKELY METAVERSE PROJECT
ASTRO PROJECT IS AN INDIE START-UP TAKING BIG SWINGS
The Crip Ya Enthusiasm music video sounds improbable: a gamified, Web3 metaverse project built in a video game engine, starring Snoop Dogg blended with Larry David, featuring appearances from the late rappers Tupac Shakur and The Notorious B.I.G., all wrapped around an NFT component and powered by motion capture.
While the video is high-profile and very complex, it’s not the product of a large studio or well-established VFX house. It’s the work of Astro Project, a small start-up formed less than a year ago by a pair of founders who had never used motion capture before.
The studio is the brainchild of director James DeFina and YouTuber Jesse Wellens. The pair began collaborating several years ago, but it was when they became interested in Web3, the metaverse, and digital production that their partnership coalesced into a company.
Wellens wanted to do a metaverse project using animation. He showed DeFina the videos Matt Workman made for his Cinematography Database channel documenting his journey learning to use a Vicon system, and DeFina quickly saw the possibilities.
“I thought, that’s amazing. It’s the future. And basically, we were like, OK, let’s start a company doing all this,” says DeFina.
From there, investing in a Vicon system was the obvious choice. “I saw Matt’s work with Vicon and I did a lot of research, and I liked how the stuff looked. I wanted to recreate that.”
James and Jesse’s learning curve got a lot steeper when one of the most famous rappers in the world got in touch.
“We didn’t really even know the full scope of what we were getting into. We just said, let’s go for it,” says DeFina. “In a month it really just happened. And suddenly we were asking, OK, how much is a Vicon system? Where do we get it?”
At the time, however, supply chain issues meant there was a six-month wait for the system they wanted to buy, and they needed to begin work almost immediately. Jeffrey Ovadya, Director of Sales & Marketing for Vicon, stepped in. “He was really helpful. He was even saying, you can borrow a system until yours is ready,” says DeFina.
James got set up with 16 Vero 2.2 cameras, a Lock Studio and a video camera, all running through Vicon’s Shōgun VFX software into Unreal Engine. Procuring a system was only half the battle, though. James had to learn motion capture very quickly.
“The training was great, which was awesome. Everybody at Vicon has been very helpful. Because I probably hit them up like a million times about everything, because I was just learning everything so fast.”
An unusual aspect of the project was how DeFina and Wellens found a way to pay that learning forward using NFTs. The first wave of NFTs was predominantly digital collectibles, but those being offered to Astro Project’s backers also have a much more practical function.
“We created digital keys that would give people access to download some of the environments, and then some of the characters and some of the mocap data,” says DeFina. “We’re letting people download the data and teaching them how we’re doing everything, so they can learn and practice on an actual project. They might not have a motion capture system, but they can use some of our mocap data from the Vicon system and then see how it’s done in Unreal Engine.”
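To give a feel for what practicing with downloaded mocap data can look like, here is a minimal Python sketch that parses per-frame marker positions from a simple CSV layout. The file format, marker names, and columns here are hypothetical illustrations, not Vicon’s actual export schema or Astro Project’s data; real captures are typically distributed in richer formats such as C3D or FBX.

```python
import csv
import io
from collections import defaultdict

# Hypothetical export: one row per marker per frame.
# Columns and marker names are invented for illustration only.
SAMPLE = """frame,marker,x,y,z
0,head,0.0,1.7,0.0
0,hand_l,-0.4,1.1,0.2
1,head,0.01,1.71,0.0
1,hand_l,-0.38,1.12,0.21
"""

def load_frames(text):
    """Group marker positions (x, y, z) by frame number."""
    frames = defaultdict(dict)
    for row in csv.DictReader(io.StringIO(text)):
        frames[int(row["frame"])][row["marker"]] = (
            float(row["x"]), float(row["y"]), float(row["z"])
        )
    return dict(frames)

frames = load_frames(SAMPLE)
print(len(frames))        # number of captured frames
print(frames[0]["head"])  # 'head' marker position in frame 0
```

From a structure like this, a learner could inspect how markers move frame to frame before moving on to retargeting the motion onto a character in Unreal Engine.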
The form that the keys take is a tongue-in-cheek nod to digital culture – a virtual donut. “We decided to do these digital donuts because in 3D design, one of the first things people ever learn is how to make donuts in Blender,” says DeFina.
ONE DAY TO SHOOT
While Astro Project is inviting fans to learn from its work, DeFina is well aware that he’s still a relative beginner himself. Capturing Snoop Dogg, in particular, was a learning experience. “The day of the shoot, I was on a Zoom call with people from Vicon and a couple of our other technology partners, and we’re all trying to figure out how they can work together simultaneously. It was one of our first shoots ever, and we had Snoop coming in for one day, and we had to just go for it,” he says.
Fortunately, the creative direction only required a light touch. “We kind of let him do his thing. Obviously the character’s half Snoop, half Larry David. Since he was looking like Larry David a lot, we kind of just wanted him to move like Snoop Dogg, right? Like his mannerisms, his dancing and all that, so that people know it’s him.”
And for all the venture’s challenges, it was a success. “When we showed him the final project, he hugged us. He loved it. He definitely wants to do more.”
A MORE DECENTRALIZED METAVERSE
Now that DeFina and Jesse have their system set up and understand how to use it, they want to experiment with further boundary-pushing content. “One thing we really want to do is make hyper-realistic animated content with custom Epic MetaHumans and Unreal Engine,” says DeFina. “But then, we also want to do minigame experiences that can go along with a piece of content, like AAA minigames.”
James is also interested in using his Vicon system and Unreal Engine not for filming things he couldn’t shoot in real life, but simply for the sake of convenience. “It’s almost a one-to-one comparison, right? For example, with a virtual set I can light something just how I light it in real life without needing this whole crew.”
James and Jesse have also been experimenting with VTubing and creating a virtual podcast, streaming motion-captured characters in real time. “We’ve been testing some new tech that could allow us to do live streaming in two different locations at the same time with two characters. So, we can have somebody in our studio, and then somebody who’s physically in Canada, at the same time in the same virtual environment,” says DeFina. Looking ahead, DeFina sees a future for all this metaverse content that isn’t only about the big operators such as Epic, Meta and Roblox.
“It’s always been a thing with these big gaming companies: they were the only ones doing it,” he says. “And I think that now, more and more creators are going to get into it. There’s going to be a lot of smaller studios and startups using motion capture animation for the metaverse. It also leads to a lot more collaboration between people. And as technology gets better and better, things are going to become easier for people. There are a lot of people building solutions inside of Unreal Engine to make everything simpler. I think it’s just going to get easier and easier.”
Watch ‘Crip Ya Enthusiasm’: