Creative Bloq
Technology
Paul Hatton

Move AI's markerless mocap tech is dominating everything, from Grimes to EA Sports games

Move Live.

It was a long time coming, but Move AI's launch of markerless motion capture was well worth the wait. I covered Move AI at launch and was impressed with this AI motion-capture app. Eight months on, the tool has developed from an offline markerless motion-capture technology into a real-time mocap solution, and it's dominating, well… everything.

All of this has caused quite a stir within the world of TV and film. For the first time, it is possible to create real-time 3D digital experiences in virtual production, sports, gaming, retail events, broadcast TV, and more.

The past year has been busy at Move. After years of R&D, they launched Move One single-camera motion capture, Move Pro multi-camera motion capture, and Move Live real-time motion capture and post-processing.

Move Live has already been utilised by OMM to create live marketing events in Nike stores in London, Paris, and Berlin for UEFA Euro 2024. Fans could be scanned live, in real time, and take part in digitised, virtual sports activities.

In the middle of all the hype, I caught up with Brian Bosché, VP of revenue and marketing at Move AI. He has more than 12 years' experience in creative technology and was the CEO and co-founder of Slope, a content collaboration tool for marketing and creative teams.

What is the most interesting project that you've heard about that has utilised Move technology?

Brian Bosché: Move has been used by thousands of studios and brands across industries, but EA using Move AI is one of the most interesting because of how far they push the system. It’s a compelling challenge to capture the authentic natural movement of athletes playing in real-game scenarios in large volumes. I also grew up playing EA Sports games, so it’s near and dear to my heart.

One of the newer customers we are excited to work with is Improbable. They are using Move Live to host live events in the metaverse. It’s amazing to see how real-time markerless motion capture can be used for everything from music performances to speaking events to meet-and-greets.

How did the collaboration with OMM on the Nike EC24 digital experience come about?

BB: We've known Nike and the team at OMM for a while and admire their innovative approach to creating digital experiences. Together, we have been exploring use cases for brand activations using Move Live.

Fortunately, when we mentioned to the OMM team that we were launching Move Live and seeking market partners, they had a brief for Nike and were looking for the technology to deliver it.

Did OMM just take your technology, or was it more of a hands-on effort from Move AI?

BB: It's been a partnership in the truest sense of the word. OMM handles the end client, the creative work, and the in-store installation, while the Move team provides background support, including in-person assistance, software support, and on-site support for the installation.

What technical challenges were there to setting up motion capture in such public spaces?

BB: To track humans using computer vision and AI with any degree of usable quality, a clear line of sight of the entire body is necessary for extracting 3D motion data. The in-store activation was challenging due to the small tracking area and the cameras being rigged at an acute angle to those being tracked. It was live and active across three sites for a month without any support incidents, highlighting the software's robustness.
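
Move AI hasn't published the details of its pipeline, but as a rough illustration of that "line of sight" requirement, here is what a full-body visibility gate might look like before a frame is passed on to a 3D solve. The sketch uses the open-source MediaPipe Pose estimator, and the landmark choices and confidence threshold are assumptions for illustration only, not Move's actual software or settings.

```python
# Illustrative only: a full-body "line of sight" check using the open-source
# MediaPipe Pose estimator. Move AI's pipeline is proprietary; the landmark
# set and visibility threshold below are assumptions for this sketch.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

# Landmarks that roughly span the whole body; if any are occluded in a
# camera's view, a 3D solve from that camera is unlikely to be reliable.
KEY_LANDMARKS = [
    mp_pose.PoseLandmark.LEFT_SHOULDER, mp_pose.PoseLandmark.RIGHT_SHOULDER,
    mp_pose.PoseLandmark.LEFT_HIP, mp_pose.PoseLandmark.RIGHT_HIP,
    mp_pose.PoseLandmark.LEFT_WRIST, mp_pose.PoseLandmark.RIGHT_WRIST,
    mp_pose.PoseLandmark.LEFT_ANKLE, mp_pose.PoseLandmark.RIGHT_ANKLE,
]

def full_body_visible(frame_bgr, min_visibility=0.5):
    """Return True if every key landmark is detected with decent confidence."""
    with mp_pose.Pose(static_image_mode=True) as pose:
        results = pose.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.pose_landmarks:
        return False  # no person found in this frame
    landmarks = results.pose_landmarks.landmark
    return all(landmarks[lm].visibility >= min_visibility for lm in KEY_LANDMARKS)

if __name__ == "__main__":
    frame = cv2.imread("camera_frame.jpg")  # one frame from a capture camera
    print("usable for 3D solve:", full_body_visible(frame))
```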

(Image credit: Move AI / Racquet Studios)

What were your reflections on the back of XiteLabs using Move AI to create real-time previs and motion capture of Grimes performing?

BB: The Grimes project came in at the last minute. Thankfully, when we called our partner studio, Metalvx in LA, their team was able to mobilise quickly to execute on the project.

The benefits of using Move Live were twofold: firstly, Grimes could walk into the capture volume seamlessly and immediately see her own Grimes AI avatar performing her exact movements. This allowed her to maintain her freedom of expression without being hindered by technology or time constraints.

Secondly, some movements, such as floor interactions, are tricky for motion capture, so the team knew where cleanup would be required, saving time and money by focusing on these areas. Ultimately, the data was reprocessed using our second solve feature in Move Live, bringing it to production quality for the XR broadcast.
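
Move hasn't detailed how its second solve works, but the general pattern — a quick live solve followed by an offline cleanup pass over the whole take — can be sketched roughly in Python: re-smoothing joint trajectories with a non-causal filter and clamping feet to the floor plane, the kind of fix a frame-by-frame live solve can't do. The skeleton layout, joint indices, and filter settings below are assumptions for illustration only.

```python
# Illustrative only: a generic offline cleanup pass over a recorded take,
# not Move Live's actual "second solve". Assumes joints are (frames, joints, 3)
# in metres with Y up, and that foot joint indices are known.
import numpy as np
from scipy.signal import savgol_filter

def second_pass_cleanup(joints, foot_indices, floor_y=0.0,
                        window=15, polyorder=3):
    """Smooth joint trajectories over the full take and keep feet above the floor."""
    # Non-causal smoothing over the whole sequence (impossible in a live solve).
    cleaned = savgol_filter(joints, window_length=window,
                            polyorder=polyorder, axis=0)
    # Clamp foot joints so floor interactions don't dip through the ground plane.
    for j in foot_indices:
        cleaned[:, j, 1] = np.maximum(cleaned[:, j, 1], floor_y)
    return cleaned

# Example: 300 frames, 24 joints, feet at indices 22 and 23 (hypothetical skeleton).
take = np.random.rand(300, 24, 3)
fixed = second_pass_cleanup(take, foot_indices=[22, 23])
```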

There is a huge opportunity for studios and brands to use Move Live to create these engaging 3D digital performances.

What are some of the more common industries that Move AI has been used in?

BB: Move AI serves many industries, but our current focus is on media and entertainment. We work with thousands of customers, and here are a few specific examples of how Move AI is used in gaming, TV, music, advertising, and more.

  • Ubisoft: Move AI was used to capture complex dance movements for the 3D animations in Just Dance 2023.
  • DR Ultra TV show "Alt er Tilladt": Move AI and Unreal Engine were used to create VFX shots for Season 1 of this Danish show.
  • Eye Garden: Collaborated with director Jordan Fish on MGMT's “Mother Nature” music video, using Move AI.
  • Artificial Rome: Created a digital twin of a soccer match at Wembley Stadium for the SOIL metaverse platform, capturing 22 footballers during a real-life match.
  • MUVA House and Publicis Groupe: Used Move to create high-quality animations for a Heineken campaign, delivering the Heineken Game in just two weeks.

I'm excited to see customers in healthcare, retail, consumer applications, and manufacturing use the Move API to capture, analyse, and generate human motion data for sophisticated 3D applications and experiences.

(Image credit: Move AI / Racquet Studios)

What advancements would you like to see in the area of motion capture in the coming years?

BB: We envision significant improvements in the accuracy and precision of markerless motion capture systems. Our goal at Move is to develop solutions that rival, if not surpass, the precision of optical (marker-based) systems and to achieve such accuracy with real-time processing.

This presents a huge opportunity to expand human motion capture across multiple industries. Incorporating AI and machine learning can revolutionise these systems by enhancing predictive modelling, automatic error correction, and generating realistic animations from motion data that can adapt to various environments and subjects.

What are some of the challenges that need overcoming before the above advancements can be achieved?

BB: These are some of the top challenges we see as we make advancements in markerless motion capture technology:

  • Camera technology is constantly improving. Being able to process 4K and 8K footage at high frame rate and in real time will be a huge hardware challenge.
  • Consumer-grade technology needs further development to boost the value of motion data. Creating widely adopted hardware to consume 3D digital experiences will open up endless use cases.
  • Educating customers on how they can use this technology to enhance their products and operational workflows. Capturing, consuming, and using 3D data is not trivial and requires a big shift across different industries.

Interested in mocap? Then read our breakdown of how Move AI works, our tutorial on how to animate a character using mocap in Unreal Engine 5, and pick up one of the best laptops.
