The European Commission has initiated an investigation into Meta Platforms’ Facebook and Instagram, citing concerns over their failure to combat deceptive advertising and disinformation.
Amid worries about foreign actors such as Russia, China, and Iran, as well as domestic political parties spreading disinformation to win votes, EU regulators are scrutinizing Meta’s adherence to EU online content regulations.
The Digital Services Act mandates stricter measures for Big Tech to counter illegal content, with potential fines of up to 6% of their global annual turnover.
“We suspect that Meta’s moderation is insufficient, that it lacks transparency of advertisements and content moderation procedures,” EU digital chief Margrethe Vestager said in a statement.
“So today, we have opened proceedings against Meta to assess their compliance with the Digital Services Act,” she said.
Meta, which has more than 250 million monthly active users in the European Union, defended its risk mitigation processes.
“We have a well established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details of this work,” a Meta spokesperson said.
Despite Meta’s defense of its risk mitigation processes, the Commission believes Meta may be falling short in addressing deceptive advertisements, disinformation campaigns, and coordinated inauthentic behavior.

Additionally, the absence of an effective third-party real-time tool for monitoring civic discourse and elections ahead of the European Parliament elections has raised further concerns. Meta has been given five working days to outline remedial actions to address these concerns.