Inverse
Technology
Robin Bea

A 'Valorant' Viral Incident Shows That Online Gaming's Biggest Problem Isn't Going Away


Toxicity is an unfortunately common part of online gaming, which should be no surprise to anyone who’s spent much time in a multiplayer lobby. What’s much less common is an incident so bad that it draws a reply from the executive producer of the game in question. After a Valorant streamer shared a particularly upsetting instance of abuse in voice chat this week, in a post that garnered 37 million views, developer Riot Games spoke out about what it’s doing to protect players from similar abuse in the future.

“I am an incredibly strong person and I have been streaming for a very, very long time,” streamer Taylor Morgan said in a social media post earlier this week. “But absolutely nothing prepares you for someone saying this to you.” Attached to her post was a video (which includes disturbing language and should be viewed with caution) showing a heinous example of the kind of toxicity that some players regularly experience in multiplayer games like Valorant. In the clip, Morgan is explicitly threatened with sexual assault. The clip ends with Morgan leaving the game, for which she later received a suspension.

“This has been the top thing on my mind (a lot of our minds) since yesterday,” Valorant executive producer Anna Donlon replied the next day. “It's important to me that we lead with action first, so until we'd actually pressed the right buttons and made some necessary internal changes, I didn't want to tweet out empty condolences when it's on us to do the hard work here.”

It’s rare for a developer to respond at all, and rarer still for one to acknowledge, as Donlon seems to, that the responsibility for fixing player behavior falls on Riot. But while it may be some reassurance that someone at Riot is listening to players’ concerns, Donlon doesn’t explain what actions have been taken in response, and details on how the studio plans to address toxic behavior in general are far murkier.

“We made a genuine commitment to Valorant’s entire global community that we will do everything we can to foster an inclusive and safe environment for our players,” a Riot Games spokesperson tells Inverse. “In addition to the player reporting tools, automatic detection system, and our Muted Words List, we’re currently beta testing our voice moderation system in North America, enabling Riot to record and evaluate in-game voice comms.”

The spokesperson is referring to a program Riot announced in 2022, which monitors in-game voice chat to train its automated moderation tools to identify abusive language. Similar systems have already been implemented in games like Fortnite.

However, it’s not clear what effect these moderation systems actually have on player behavior. Recordings of toxic chat can help moderation tools identify when a player has broken a game’s rules, but they don’t necessarily lead to a safer environment. In the comments to Morgan’s post, dozens of players report receiving similar abuse in their own Valorant matches, suggesting that enforcement is still lacking. Riot didn’t immediately respond to questions on whether its voice recording program has actually led to actions against offending players.

The problem isn’t confined to Valorant, either. Surveys of women’s experiences in online games have found widely varying figures. A 2024 survey from The Guardian found that 49 percent of women experienced toxicity in online games, with 80 percent of that abuse taking the form of sexual harassment. A 2022 report from Bryter Global found that 73 percent of women received abusive comments, with 28 percent reporting sexual harassment. One clear pattern is that women tend to experience more harassment than men, and that the harassment tends to be explicitly sexual, including being sent inappropriate photos and receiving physical threats.

While Donlon indicates that some action was taken as a result of Morgan sharing her experience, plenty of other players face similar harassment with no remedy. Several replies to Morgan’s post call for hardware bans for offenders, which would block the PC or console of repeat harassers from accessing the game even if they made a new account.

“We've actually considered this approach,” Donlon replied. “There are some complexities we're navigating, but it is for sure an option in the future.”

Developers may be slow to outright ban users for fear of fracturing a game’s community, but the problem is too big for them to ignore. A 2023 report from Unity found that 73 percent of players avoid games with known toxic communities, and 67 percent would leave a game where they experienced abuse, though as Morgan’s experience shows, leaving a game can sometimes come with harsher penalties than abusive chat. A 2023 paper published in the journal Computers in Human Behavior Reports found that normalization of toxicity tends to increase its negative effects, so the reluctance to ban players could itself be encouraging abuse.

Donlon’s response is an encouraging sign that at least some game developers do take harassment seriously and genuinely want to change things for the better, but it’s not enough. Riot hasn’t laid out exactly how it deals with abusive chat violations, and beyond Morgan’s case, most victims of abuse in games don’t have a big enough platform to garner a personal response from developers. The gap between what developers say about harassment and what they’re willing to do — or at least publicly disclose — about it is as wide as ever, and no amount of consoling words will prevent it from happening again.
