Fortnite is rolling out a new tool to help players combat bad behavior in-game, Epic Games announced Thursday. Starting today, voice reporting lets players send audio clips of harassment and other rule violations alongside the existing reporting feature.
The new voice reporting feature isn’t being introduced in response to any particular player behavior, Epic Games’ senior communications manager Jake Jones tells Inverse. The move is instead part of the company’s “ongoing effort to build a safe and fun ecosystem,” Jones says. Epic’s existing safety features include Cabined Accounts, which limit interactions with other players and restrict Epic store activity for players under the age of 13.
Voice reporting will be turned on by default in most cases. Players over 18 years old will have the option of changing their voice reporting setting from “Always On” to “Off When Possible.” Voice reporting will only be disabled in a party chat where every member has selected “Off When Possible.” Players under 18 years old won’t have the option to turn voice reporting off, even in parties. Voice chats will always include a message showing whether voice reporting is on or off.
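Epic hasn’t shared how that check works under the hood, but the rule as described reduces to a simple predicate over party members’ ages and settings. The sketch below is purely illustrative; the types, names, and exact age cutoff are assumptions, not Epic’s code.

```python
from dataclasses import dataclass

@dataclass
class PartyMember:
    age: int
    setting: str  # "Always On" or "Off When Possible"; minors are locked to "Always On"

def voice_reporting_enabled(party: list[PartyMember]) -> bool:
    """Reporting stays on unless every member is an adult who has opted out."""
    for member in party:
        if member.age < 18:
            return True  # minors can't opt out, so the whole chat stays reportable
        if member.setting != "Off When Possible":
            return True
    return False  # everyone is an adult who selected "Off When Possible"

# One "Always On" adult is enough to keep voice reporting active for the party.
party = [PartyMember(25, "Off When Possible"), PartyMember(31, "Always On")]
print(voice_reporting_enabled(party))  # True
```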
In its announcement, Epic says that the only way for younger players to avoid having their voices recorded is to turn voice chat off or mute their microphones entirely.
Fortnite’s current community rules ban bullying, harassment, and hate speech, as well as disclosing others’ personal information, impersonating another player, and promoting illegal activities. Anyone who thinks another player has broken a rule in voice chat now has the option to include audio, which Fortnite records in rolling five-minute clips. When a report is sent, Epic will keep the recording for 14 days, or for the duration of any suspension that results from it.
When asked how the voice reporting feature would comply with the Children’s Online Privacy Protection Act, which prohibits collecting personal data from children under 13 without parental consent, Jones confirmed that Epic will continue to require parental permission for players under 13 to use voice chat.
As Epic notes in a technical blog post, voice recordings aren’t stored on the company’s servers by default. They’re kept on players’ devices instead and only sent to Epic’s backend systems when a player chooses to submit them with a report. The audio is then analyzed by a moderation team working in concert with machine learning tools. The post also goes into detail on how clips from different speakers are tracked and identified, since all audio reports are submitted anonymously.
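Epic’s post stops short of implementation details, but the behavior it describes amounts to a rolling buffer kept on the player’s device: audio older than five minutes simply ages out, and nothing is uploaded unless a report is filed. Here is a rough sketch of that idea; the class and method names are hypothetical, not drawn from Epic’s code.

```python
import time
from collections import deque

BUFFER_SECONDS = 5 * 60  # roughly the last five minutes of voice chat

class RollingVoiceBuffer:
    """Client-side buffer that never leaves the device on its own."""

    def __init__(self):
        self.chunks = deque()  # (timestamp, audio_bytes), oldest first

    def append(self, audio_bytes: bytes) -> None:
        """Store a new chunk locally and discard anything past the window."""
        now = time.time()
        self.chunks.append((now, audio_bytes))
        while self.chunks and now - self.chunks[0][0] > BUFFER_SECONDS:
            self.chunks.popleft()  # old audio is dropped, not uploaded

    def snapshot_for_report(self) -> bytes:
        """Called only when the player files a report; returns the clip to send."""
        return b"".join(chunk for _, chunk in self.chunks)
```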
At least for now, Epic is implementing voice reporting only in Fortnite. The company’s Voice Reporting FAQ does state that it plans to add the feature to other games like Rocket League in the future, but it doesn’t have a timeline for when that could happen. Also on Epic’s roadmap is the ability for players to be notified when voice reports they submit result in any action being taken.
Voice reporting has become a more common part of moderation in games in recent years. Competitive shooter Overwatch 2 began moderating voice chat as part of its Defense Matrix program last year, opting to use transcriptions of chats rather than evaluating the audio itself.
Sony and Microsoft have also launched voice moderation features console-wide, rather than leaving moderation to individual developers. In all these cases, voice reporting systems work by allowing players to capture and share audio clips with developers or platform holders, not by having audio passively monitored at all times.
Study after study has shown that toxic culture is a problem across online gaming. One study this year, from mental health advocacy org Take This, shows that 70 percent of players have actively avoided certain games because their communities have a reputation for being hostile. Active voice moderation won’t solve the problem entirely, but it could be one way of cracking down on repeat offenders.