Meta faces EU investigation over Russian disinformation


The European Commission has initiated a formal investigation into Meta regarding its management of political content, focusing on a possible Russian influence campaign.

With elections approaching in the EU and elsewhere, regulators will assess whether Meta’s measures for combating misinformation on Facebook and Instagram breach EU rules.

The Commission is particularly worried about how Meta supervises its advertising tools and whether these have been misused by “malicious actors.”

Additionally, the investigation will scrutinize Meta’s transparency in moderating political content and accounts.

“We have a well-established process for identifying and mitigating risks on our platforms,” Meta said in a statement.

“We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”

The company is one of several tech firms designated “very large online platforms” (VLOPs) under the bloc’s Digital Services Act (DSA).

VLOPs can be fined up to 6% of their global annual turnover if they fail to comply with stricter content moderation standards. These require proactive measures to prevent election manipulation and the spread of misinformation.

The Commission has expressed concerns that Meta’s existing procedures for moderating disinformation and political advertisements might not meet the requirements of the DSA.

The Commission is also concerned about the potential impact on the forthcoming electoral cycle, particularly the European Parliament elections scheduled for June.

“This Commission has created means to protect European citizens from targeted disinformation and manipulation by third countries,” said Commission President Ursula von der Leyen.

“If we suspect a violation of the rules, we act. This is true at all times, but especially in times of democratic elections.”

The four concerns at the heart of the EU Commission’s investigation are:

  • Ineffective oversight and moderation of adverts
  • Lack of transparency over demotion of political content and accounts
  • Journalists and civil society researchers having no easy access to real-time data or tools to monitor political content during elections
  • Lack of clear and easy ways for users to report illegal content

A European Commission official stated that Meta’s current method of moderating advertisements does not align with the requirements of the DSA.

The Commission cited a study by the non-profit research organization AI Forensics, which identified a Russian influence campaign that had been placing advertisements across Meta’s platforms.

AI Forensics reported discovering a network of 3,826 pages spreading “pro-Russian propaganda” that reached 38 million users between August 2023 and March 2024. According to its findings, fewer than 20% of the campaign’s ads were flagged by Meta as political.

Meta says it has taken action against the “Doppelganger” campaign since first uncovering it in 2022, and that user engagement with the campaign has declined.

The Commission has given Meta five days to provide information on the tools available to journalists and researchers for monitoring content on Facebook and Instagram during the upcoming elections.

Concerns were also raised about Meta’s management of CrowdTangle, a public tool that offers data and insights into content engagement on Facebook and Instagram. Despite Meta’s announcement in March that CrowdTangle would be discontinued on August 14, the company stated that it is developing new tools to offer broader access to platform data.

This investigation by the EU Commission comes after a similar inquiry into disinformation practices on X (formerly Twitter) launched in March.