Windows Central
Cole Martin

Call of Duty partners with Modulate to add real-time voice chat moderation to Modern Warfare 3

Call of Duty: Modern Warfare 3 reveal screenshots

What you need to know

  • Call of Duty: Modern Warfare 3 is being developed by Sledgehammer Games and published by Activision, with support from nearly a dozen additional Activision studios.
  • Toxic voice and text chat, hate speech, and harassment have long plagued the blockbuster franchise's online lobbies.
  • Sledgehammer Games and Activision are partnering with Modulate to bring ToxMod—an AI-powered voice chat moderation tool—to the next premium release, Call of Duty: Modern Warfare 3, in November.
  • A beta for ToxMod will begin rolling out to existing Call of Duty titles, including Modern Warfare 2 and Warzone, starting August 30 in North America, with a worldwide release (excluding Asia) timed to the launch of Modern Warfare 3 on November 10.

Call of Duty and toxicity have a long, sordid history. The phrase "You wouldn't last in a COD lobby" is even hurled online at people who object to derogatory language and slurs. With the growing popularity of recent releases and the free-to-play availability of Call of Duty: Warzone, however, Activision and the development studios behind Call of Duty have made combating toxicity in the multiplayer space a priority. Activision has announced a new partnership with Modulate that will bring real-time voice chat moderation, at scale, to Call of Duty servers with the launch of Modern Warfare 3 on November 10.

Call of Duty: Modern Warfare 2 launched with a modernized Code of Conduct that players were expected to read and agree to when they first booted the game, with a refreshed agreement occasionally reappearing at the start of a new season. The Code of Conduct proved difficult to enforce, however, and the new in-game reporting system was often bombarded with false and retaliatory accusations.

Players who reported others for hateful or harassing conduct were also never told whether their complaints had been received or acted upon. Eventually, Activision and Modern Warfare 2's lead developer, Infinity Ward, announced that efforts to auto-moderate voice and text chat were being shelved because of false complaints.

The fight against discrimination, hate speech, and harassment was far from over for Activision, however. The partnership with Modulate introduces AI-powered voice chat moderation to Call of Duty via a system known as ToxMod. According to a press release from Activision, ToxMod will bolster the existing efforts of the Call of Duty anti-toxicity team, which already include text-based filtering across 14 languages for in-game text chat and player usernames, as well as a robust in-game player reporting system.

There’s no place for disruptive behavior or harassment in games, ever. Tackling disruptive voice chat particularly has long been an extraordinary challenge across gaming. With this collaboration, we are now bringing Modulate’s state-of-the-art machine learning technology that can scale in real-time for a global level of enforcement. This is a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players.

Michael Vance, Activision CTO

ToxMod will begin rolling out in North America as a beta for existing Call of Duty titles, including Call of Duty: Modern Warfare 2 and Call of Duty: Warzone, on August 30 as part of the Season 5 Reloaded patch. The full release is scheduled alongside the launch of Call of Duty: Modern Warfare 3 on November 10 and is expected to take effect worldwide, excluding Asia. Support will begin with English only, with additional languages added over time.

Current enforcement efforts by the Call of Duty anti-toxicity team have already resulted in more than 1 million accounts having their access to voice and text chat restricted for violating the Call of Duty Code of Conduct.
