EU to Investigate Elon Musk’s X for Misinformation, Lack of Transparency Under New Digital Services Act

The platform said it is “cooperating with the regulatory process” and asked that the process “remains free of political influence and follows the law”

Thierry Breton and Elon Musk (Getty Images)

The European Union has launched a formal investigation into Elon Musk’s X to determine whether the platform may have breached the Digital Services Act (DSA) in areas linked to risk management, content moderation, dark patterns, advertising transparency and data access for researchers.

The inquiry will focus on X’s compliance with DSA obligations related to countering the dissemination of illegal content in the EU; the effectiveness of its measures taken to combat information manipulation, such as Community Notes; the measures taken by X to increase transparency of its platform; and a “suspected deceptive design of the user interface” in relation to the blue checkmarks linked to certain subscription products.

“Today’s opening of formal proceedings against X makes it clear that, with the DSA, the time of big online platforms behaving like they are ‘too big to care’ has come to an end,” European Commissioner for the Internal Market Thierry Breton said in a statement. “We now have clear rules, ex ante obligations, strong oversight, speedy enforcement, and deterrent sanctions, and we will make full use of our toolbox to protect our citizens and democracies.”

The EU’s decision follows a preliminary investigation, which included an analysis of a risk assessment report submitted by X in September, its transparency report published in November, and its replies to a formal request for information, which, among other things, concerned the dissemination of illegal content in the context of Hamas’ terrorist attacks against Israel.

Under the DSA, X is classified as a Very Large Online Platform (VLOP), a designation that requires it to “diligently identify, analyse, and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, or from the use made of their services.”

VLOPs are also required to notify individuals or entities of content moderation decisions in a “timely, diligent, non-arbitrary and objective manner,” providing information on “the possibilities for redress in respect of that decision.”

Additionally, they cannot design, organize or operate their online interfaces in a way that “deceives or manipulates their users or in a way that otherwise materially distorts or impairs the ability of the users of their service to make free and informed decisions.”

They must also compile advertisements presented on the platform and make them publicly available, through a searchable and reliable tool, until one year after they were displayed, and provide researchers with “effective access” to platform data.

In a statement, X said that it “remains committed to complying with the Digital Services Act and is cooperating with the regulatory process.”

“It is important that this process remains free of political influence and follows the law,” the statement continued. “X is focused on creating a safe and inclusive environment for all users on our platform, while protecting freedom of expression, and we will continue to work tirelessly towards this goal.”

Moving forward, the commission will continue to gather evidence by sending additional requests for information and conducting interviews or inspections.

“The opening of formal proceedings empowers the Commission to take further enforcement steps, such as interim measures, and non-compliance decisions,” the EU added. “The Commission is also empowered to accept any commitment made by X to remedy on the matters subject to the proceeding.”

The DSA does not set a legal deadline for the duration of the investigation. Factors include the complexity of the case, the extent to which the company concerned cooperates with the Commission, and the exercise of the rights of defense.
