European Union Initiates Major Investigation into X’s Moderation Practices

By Huey Yee Ong

May 10, 2024

The European Union has launched a comprehensive investigation into X, formerly known as Twitter, under the Digital Services Act (DSA), a significant regulatory framework aimed at curbing illegal and harmful online content.

This marks the first major probe into the company since the enactment of the DSA, with the European Commission, the executive arm of the EU, spearheading this initiative.

On Wednesday, May 8, the Commission disclosed that it had formally requested information from X concerning its content moderation policies and practices. The request follows X’s March 2024 transparency report, which showed a reduction of nearly 20% in the platform’s content moderation team compared with figures reported in October 2023. The downsizing has raised alarms about X’s capacity to manage content effectively under the DSA’s stringent requirements.

Examining the Reduction in Moderation Resources

The Commission’s concerns are not limited to staff reductions. It also noted that X has reduced its linguistic coverage within the EU from 11 languages to 7, a cut that could impair the platform’s ability to monitor and moderate content effectively across EU languages.

Moreover, the EU body is seeking additional details from X regarding its risk assessments and the specific mitigation measures it has implemented, particularly concerning the impact of generative artificial intelligence on electoral processes, the spread of illegal content, and the safeguarding of fundamental rights. This inquiry extends to understanding how X plans to handle challenges posed by advanced AI technologies in content management.

The request for information includes a deadline: X must submit the required details about its content moderation resources and its approach to generative AI by May 17. The company is further obligated to respond to all remaining inquiries from the Commission by no later than May 27.

This probe is part of the broader formal proceedings opened by the Commission in December 2023, following concerns over X’s handling of content during the Israel-Hamas conflict. The investigation aims to assess X’s adherence to its obligations under the DSA to counteract the dissemination of illegal content, combat information manipulation, and enhance transparency within the EU.

The Implications of Non-Compliance Under the DSA

As part of the ongoing investigation, EU officials have emphasized that the information request will build on evidence previously gathered, which includes X’s March transparency report and earlier responses related to its strategies for addressing disinformation risks linked to generative AI.

The DSA, which came into force in November 2022, requires large online platforms such as X to diligently mitigate the risks of disinformation and remove hate speech, while balancing these measures against freedom of expression. Companies that fail to comply with these stringent regulations could face penalties of up to 6% of their global annual revenue.


Featured Image courtesy of Matt Cardy/Getty Images

Huey Yee Ong

Hello, from one tech geek to another. Not your beloved TechCrunch writer, but a writer with an avid interest in the fast-paced tech scene and all the latest tech mojo. I bring a unique take on tech, honed by an applied psychology perspective, to make tech news digestible. In other words, I deliver tech news that is easy to read.
