Elon Musk-led X, formerly known as Twitter, is gearing up to build a new moderation office in Austin, Texas, primarily focused on tackling content depicting child sexual abuse.
The head of business operations at X, Joe Benarroch, confirmed that the Austin-based "Trust and Safety Center of Excellence" will initially hire 100 full-time content moderators, according to a report by Bloomberg.
The team will reportedly focus on rooting out material related to child sexual exploitation (CSE) while also helping enforce X's other rules, such as those restricting hate speech and violent posts, the executive explained.
Benarroch did not say when the new center will be operational.
X is tightening its grip on violations of the platform's rules
Users must be at least 13 years old to open an account on the platform, which says less than 1 per cent of its daily users are between the ages of 13 and 17. Advertisers cannot target users aged 17 and under.
The move comes after the Australian eSafety Commissioner criticised X earlier this month for making deep cuts to its safety and public policy personnel. Moreover, Musk changed some policies last year with the stated aim of protecting free speech as far as the law allows.
X CEO Linda Yaccarino, who will be testifying before Congress for a hearing on child safety online, shared a link to a blog post that highlights the company's efforts to restrict such materials.
"@X is an entirely new company, and over the past 14 months we've strengthened all our policies and enforcement to prevent bad actors from distributing or engaging with child sexual exploitation content," she wrote.
The top executive noted that the company is "determined to make X inhospitable for actors who seek to exploit minors". Yaccarino will be joined by the CEOs of other major tech companies like Meta Platforms Inc., Discord, TikTok and Snap Inc.
Last week, Yaccarino met elected officials from both parties to discuss topics like artificial intelligence, disinformation, content moderation and child protection.
"We wanted to help Senators/staff understand how X is a new company and what has evolved over the past 14 months. Because as it relates to CSE in particular -- there had been a lot ignored before acquisition that X has changed," Benarroch said.
Benarroch went on to mention that X currently has over 2,000 content moderators, including full-time employees and contractors.
Last month, the EU launched formal infringement proceedings against X under the Digital Services Act, a law that targets disinformation and hate speech, after identifying posts related to Hamas's October 7 attack on Israel. Likewise, the Center for Countering Digital Hate (CCDH) has accused the platform of failing to moderate hate speech.