Creative Bloq
Alicia Haddick

I visited Cygames' massive mocap studio in Japan, and I'm in awe

Inside Cygames' large motion capture studio during my visit.

Motion capture technology changed video games. It’s often easy to forget that we’re barely 30 years into 3D gaming as a concept, in which time we’ve evolved from painstakingly rendering simple polygons to building rich, complex worlds in advanced platforms like Unreal Engine 5 and even the prospect of AI game engines.

But there’s more than simply fine detail that makes a game believable. Can we invest in the emotions of its characters? Do we feel what they feel? Can we sense their joy or pain in their actions? Even in large fantastical worlds with unfamiliar creatures, does it all feel real? For that, motion capture has become an indispensable tool in going beyond code and triangles and into something more tangible.

Watch a behind-the-scenes making-of video for blockbuster franchises like Assassin’s Creed or The Last of Us and you’ll see footage of people in spandex suits covered in small tracking markers, and even some smaller titles use at least facial tracking to enhance the performances of their characters.

To learn more about how mocap has changed, and is still changing, game development, I visited one of the largest motion capture facilities in Asia, built by Japanese studio Cygames, a company that relies on the technology across many of its domestic hits and has major ambitions on the global stage.

Cygames was founded in 2011 as a mobile-first studio working with DeNA on licensed titles like The Idolm@ster and original projects such as Rage of Bahamut. Each was a success in its own right, but alongside the other titles launched in those early days, the studio found its biggest hit to date with Granblue Fantasy, an RPG featuring music from Nobuo Uematsu alongside work from other Final Fantasy veterans.

The game was an ambitious journey to the skies that resonated immediately with audiences and has since spawned its own annual convention, an anime, a console game and much more. Read our Granblue Fantasy: Relink review for a glimpse of the vibrant Cygames anime art style.

Nowadays the company is a multimedia giant, one of the biggest in the industry, with over 3,000 employees across its core game studios in Tokyo, Saga and Osaka, while simultaneously running an anime production studio (Cygames Pictures), a manga production team, an eSports division and more, all in-house.

(Image credit: Cygames)

Notably, as a media production company first and foremost, Cygames treats motion capture as key to its operations and to the identity of many of its IPs. While the Tokyo studio is dedicated to its mobile games, the Osaka studio was founded with the ambition of being the company’s in-house team for taking its ideas to consoles using state-of-the-art technology.

The mobile game Uma Musume: Pretty Derby (releasing in English in 2025 after a successful debut in Japan and across Asia) is inspired by horse racing, built on the concept that famous racehorses are reborn as horse girls who race and train to be the best, and it serves as a major example of how the technology is used in-house across a single franchise. In this world the horse girls are as famous as pop idols and superstars, with race winners even performing concerts of original music after the finish.

On phones, the game is rendered in 3D to bring a sense of realism to training, races and the post-victory idol performances with their intense, detailed choreography; the same capture work also feeds into the anime production. Granblue Fantasy: Relink, the company’s first major in-house console project, used motion capture to bring the kinetic, intense combat abilities the team imagined to life, and it’s key to why the game’s combat feels so satisfying in the heat of the moment.

Every project Cygames works on relies on the technology, so it’s no wonder the company has not one but two in-house motion capture studios, in Tokyo and Osaka.

The Osaka capture studio is a relatively recent addition, opened in 2022, and one the company boasts is among the largest of its kind in the world. I visited both to understand just how the team uses the technology, and walking into the expansive Osaka studio for the first time, my immediate reaction was a mix of surprise and awe.

To be honest, I hadn’t been sure what to expect; I’d seen motion capture studios plenty of times on video, but never in person. Even so, witnessing the work that goes into running a real motion capture studio, particularly one of this scale, was both more minimalist and more technically stunning than anything I had envisioned.

The room is evenly lit with harsh LED lights to ensure optimal brightness for the 168 cameras mounted on multiple tiers of scaffolding around the perimeter of this 14m x 10m x 8m space, capturing every angle of an actor’s performance.

A large projector hangs overhead, beaming set layouts or a virtual guidance map onto the floor, replacing the painstaking process of applying and ripping away the guidance tape that would otherwise tell actors where to go next. There are even hooks at each end of the room from which actors can be suspended on retractable (or bungee) cords for more action-heavy capture.

(Image credit: Cygames)

A long curtain across one wall hides large mirrors reminiscent of those seen in dance studios, used to help actors prepare before recording a scene. Just off the back of the capture area is a carved-out hole-in-the-wall filled with screens and computers, where those overseeing the capture assess the footage in real time using Unity (with results simultaneously beamed onto larger screens mounted on one wall above the capture stage).

Although the team uses its own proprietary in-house engine for many of its game projects, Unity is used at this step of the process for its support for common motion capture technology on the market, enabling the team to make on-the-fly adjustments before handing the capture off to the core development team.

Both capture studios, in Osaka and Tokyo, are purpose-built for the company’s specific blend of media production. In Osaka, where the studio specialises in console game development, the greater space allows the team to pull off full-motion-capture action sequences, something used to great effect in Granblue Fantasy: Relink’s flashy action and sword skills. (Read our interview with the Granblue Fantasy: Relink art director for more detail on the game’s production design.)

For the company’s mobile games, especially those like Uma Musume where idol-like dance performances dominate, the Tokyo studio is capable of capturing over a dozen performers at any one time with practice space for dancers to rehearse choreography if needed.

(Image credit: Cygames)

Beyond its capture studios, Cygames also produces its own mocap suits and velcro markers in-house, refining them over the years to track performers better while remaining comfortable across a wide range of motion. Much of this development was aimed at ensuring the suit never gets in the way of a performance, even when complex props are involved.

Indeed, the team also keeps all the tools needed to build or recreate the weapons, armour and environmental pieces a scene requires on-site. Beyond a store of common items, all the crafting tools and wood needed to build ramps, stairs and other simple set pieces are on hand for the team in Tokyo.

In Osaka, the team even has access to a 3D printer alongside its projection mapping, used to quickly print 1:1 replica weapons straight from the in-game mesh, making it easier for actors to immerse themselves in a role with an exact recreation of the weapon their character should be holding at that moment. (Read our guide to the best 3D printers.)
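For a rough sense of what taking an in-game mesh to a printable 1:1 replica can involve (Cygames hasn't published its exact pipeline, so this is only a hypothetical sketch using the open-source trimesh library, with made-up file names and an assumed unit scale), the asset typically needs to be checked for watertightness and converted to real-world units before slicing:

```python
# Hypothetical sketch only: Cygames' real prop pipeline isn't public.
# Assumes the asset was exported as an OBJ and that one game unit equals
# one centimetre; both details are illustrative, not confirmed.
import trimesh

mesh = trimesh.load("sword_prop.obj", force="mesh")  # exported game asset (hypothetical file)

# Convert assumed game units (cm) to millimetres, the usual unit for slicers.
mesh.apply_scale(10.0)

# Basic sanity checks before handing the model to the printer.
print("watertight:", mesh.is_watertight)  # holes in the mesh will confuse the slicer
print("size (mm):", mesh.extents)         # confirm the replica really is 1:1

mesh.export("sword_prop_print.stl")       # STL file ready for 3D printing
```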

Seeing all this in action is impressive. To demonstrate the Osaka studio, the team projects simple test courses onto the floor for the actor to follow, letting them run through before yanking them back on the bungee cord. All of it is captured effortlessly in real time on the screens far above me.

Next door are an audio foley studio and a prop studio, with the development team just a few rooms away; during a real recording it would be easy for the team to jump in and make adjustments without losing time and resources to arranging a whole new session for the corrections they need.

(Image credit: Cygames)

In the cut-throat world of live-service and mobile gaming, this matters even more. Weekly and biweekly updates are not only the norm for mobile games; there’s an expectation that they’ll arrive promptly, with no grace period for developers if something comes up during the creative process. Anecdotally, I’m told the Tokyo capture studio makes it easy for developers to submit last-minute requests that can be actioned the same day.

If the team needs a new facial animation for a cutscene or a new pose, an in-house mocap studio means they’re only a few floors and a quick request away from recording it. If no actor is available, staff can even become an anime girl and do it themselves. That’s right: that cute anime horse girl Gold Ship running and waving at the screen? The team won’t tell me exactly when this has happened, but it could well have been someone on the development or motion capture team striking the pose in the heat of the moment to make sure an update was ready to ship.

Motion capture was first used in games to bring greater realism to 'cinematic' hits like Uncharted, Sony's early foray with Jet Li: Rise to Honor, and the work of developer Ninja Theory, a studio that has always sought inspiration outside gaming for its titles (read our interview on the making of Senua's Saga: Hellblade 2). Such cases show how developments in mocap tech have helped propel these games to the top of the industry. The influence has also flowed the other way, with the film industry adopting Unreal Engine and other game tech for its own purposes (read how Amazon's Fallout TV show made use of UE5).

Yet games are now thinking about motion capture beyond the simple desire for realism, using it to create complex animated sequences that would otherwise be impossible.

Just as Avatar brought a larger-than-life alien species to the screen, fighting and acting in ways hand-made animation couldn’t replicate, using motion capture to bring fully choreographed idol dances into a game reflects a wider trend of virtual stars and virtual characters inhabited by a real, breathing person, such as VTubers.

(Image credit: Cygames)

At Cygames, motion capture sits at the core of the production process because, whether it’s bringing complex action sequences or cute pop performances to life, animating human actions and emotions is key to building the connection with characters that keeps fans supporting the games through merchandising and multimedia expansions like movies, anime and more.

Motion capture is about the person in the suit and the actions that make beloved game characters feel real and almost human, whether they be buff soldiers in a fantasy world or an idol singer. The technology is rapidly changing, but the artistry behind making it work remains.

Visit Cygames' website for the latest news on its new and upcoming game releases for PS5, Xbox, PC and mobile.
