The EU on Wednesday told digital platform X to explain a cut to content moderation resources, amid concerns over disinformation ahead of European elections in June.
The demand is part of the EU's probe into US tech billionaire Elon Musk's X, formerly Twitter, which was launched in December under a law clamping down on illegal content online.
The European Union has launched a similar probe into Meta's Facebook and Instagram amid fears they are also doing too little to tackle disinformation.
Brussels is especially worried about the threat of Russian manipulation of voters before the polls on June 6-9.
The European Commission said it wanted more information about X's "content moderation activities and resources" after a transparency report in April showed it has cut its team of content moderators by "almost 20 percent" since an October 2023 report.
X had reduced the moderators' "linguistic coverage within the European Union from 11 EU languages to seven", it added.
It told X to hand over "detailed information and internal documents".
The EU also wants more details about "risk assessments and mitigation measures linked to the impact of generative AI tools on electoral processes", the commission said.
The request appeared to be a warning as Musk rolls out his Grok chatbot, which is available in Australia, Britain, Canada, New Zealand and the United States, but not yet in the EU.
Companies face stricter rules under the EU's mammoth content moderation law, the Digital Services Act (DSA), including requirements to conduct risk assessments before launching a new service in the bloc and to implement measures mitigating those dangers.
There have already been reports of blunders and misinformation by Grok, which X offers as a tool for summarizing news and other topics on the platform.
So far, the EU has designated 23 "very large" platforms, including X, TikTok and YouTube, which face greater scrutiny by the commission.
Breaches of the law carry the risk of fines of up to six percent of a company's global revenues. The EU also has the power to ban a platform from operating in the 27-country bloc for serious and repeated violations.
X must respond to the questions about content moderation and generative AI by May 17.