Brussels Set to Charge Meta Under Digital Services Act for Content Moderation Failures
European Commission preparing preliminary findings that Facebook and Instagram lack adequate systems for removing "harmful" content—Meta faces potential fines up to 6% of global revenue
September 30, 2025
The European Union is preparing to escalate its regulatory confrontation with Meta Platforms, readying formal charges that accuse the tech giant of failing to adequately police content on Facebook and Instagram under the bloc's sweeping Digital Services Act (DSA).
According to sources familiar with the matter, the European Commission plans to issue preliminary findings in the coming weeks stating that Meta's platforms lack a sufficient "notice and action mechanism"—the system that allows users to flag illegal content for removal. The move represents a significant escalation of an investigation launched in April 2024 and could result in fines of up to 6% of Meta's worldwide annual revenue.
For a company that generated $134 billion in revenue in 2023, such a penalty could theoretically exceed $8 billion, though Meta will have an opportunity to respond to the allegations and propose remedies before any final decision is made.
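As a rough back-of-the-envelope check of that ceiling, using the revenue figure cited above (an illustrative calculation, not an official Commission estimate):

$$0.06 \times \$134 \text{ billion} \approx \$8.0 \text{ billion}$$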
The Digital Services Act: Europe's Content Governance Framework
The Digital Services Act, which became fully applicable in February 2024, represents one of the most ambitious attempts by any government to regulate online speech and content moderation. The law applies to all online platforms operating in the European Union, but imposes the strictest requirements on "Very Large Online Platforms" (VLOPs)—services with at least 45 million average monthly active users in the EU.
Under the DSA, these platforms must remove illegal content once they become aware of it and mitigate broader "systemic risks" as defined by European regulators. This expansive mandate covers illegal hate speech, misinformation, content that could harm minors, and material that poses risks to "civic discourse" and "electoral processes."
The law gives the European Commission sweeping authority to determine what constitutes adequate content moderation and to punish companies deemed non-compliant. Platforms must conduct regular risk assessments, maintain transparent content moderation policies, provide users with clear explanations when content is removed, and offer appeal mechanisms for disputed decisions.
Critics argue that the DSA effectively outsources censorship decisions to private companies, forcing them to err on the side of removing more content to avoid massive fines—a dynamic that chills legitimate speech even as regulators claim to be protecting democratic values.
Meta's Multiple Investigations
Meta currently faces two separate investigations under the DSA. The first, launched in April 2024, examines the company's handling of disinformation, illegal content, and political speech. The second investigation, opened in May 2024, focuses on the protection of minors on Facebook and Instagram.
The Commission has raised several specific concerns about Meta's practices:
Content Moderation Systems: Regulators allege that Meta's "notice and action mechanism" does not adequately enable users to report illegal content. The Commission argues that the system must be easily accessible, efficient in processing reports, and transparent in explaining decisions—requirements that Meta allegedly fails to meet.
Deceptive Advertising and Disinformation: The Commission suspects Meta is not doing enough to counter coordinated disinformation campaigns and inauthentic behavior, particularly around sensitive topics like elections. European officials argue that the proliferation of such content presents risks to "civic discourse, electoral processes, and fundamental rights."
Political Content Visibility: Meta's decision to reduce the visibility of political content in users' feeds on both Facebook and Instagram has drawn scrutiny. The Commission questions whether this policy complies with transparency requirements and whether it inappropriately limits civic discourse.
Real-Time Monitoring Tools: One particularly contentious issue involves Meta's deprecation of CrowdTangle, a public insights tool that enabled researchers, journalists, and civil society organizations to monitor content trends in real-time. The Commission argues that Meta eliminated this tool without providing an adequate replacement, making it harder to track misinformation and electoral interference ahead of EU elections.
Protection of Minors: In a separate investigation, the Commission has raised concerns that Facebook and Instagram's algorithms may "stimulate behavioral addictions in children" and create "rabbit-hole effects" that expose minors to harmful content. Regulators are also questioning Meta's age verification methods.
The Broader Pattern of EU Tech Enforcement
Meta is far from alone in facing DSA scrutiny. The European Commission has opened formal proceedings against multiple major platforms:
- X (formerly Twitter): Received preliminary findings of DSA violations in July 2024, making it the first platform to reach this stage
- TikTok: Under investigation for addictive design features and concerns about minor protection
- AliExpress: Facing charges over inadequate measures against illegal products
- Temu: Under scrutiny for risks associated with illegal product sales
- Multiple pornography platforms: Recently designated as VLOPs and subject to DSA requirements
Despite these numerous investigations, the Commission has not yet issued any fines under the DSA. However, the regulatory pressure has already forced platforms to make significant changes to their operations within the EU.
Meta's Defense and American Pushback
Meta has consistently rejected allegations that it violated the DSA. Spokesperson Ben Walters stated that the company "disagreed with any suggestion we have breached the DSA" and confirmed ongoing negotiations with EU officials.
The company has implemented numerous changes in response to European regulatory demands, including:
- Offering users options to see less personalized content
- Introducing subscription models that allow ad-free experiences
- Implementing stricter controls on advertising to minors
- Enhancing transparency in content moderation decisions
However, Meta has also pushed back aggressively against what it views as discriminatory enforcement. When the Commission fined Meta €200 million in April 2025 under a separate regulation (the Digital Markets Act), the company's Chief Global Affairs Officer Joel Kaplan issued a scathing statement:
"The European Commission is attempting to handicap successful American businesses while allowing Chinese and European companies to operate under different standards. This isn't just about a fine; the Commission forcing us to change our business model effectively imposes a multi-billion-dollar tariff on Meta while requiring us to offer an inferior service."
This sentiment has found support from U.S. political leaders. President Donald Trump has repeatedly criticized both the DSA and the Digital Markets Act, arguing they unfairly target American companies and erode free expression. The transatlantic tension over digital regulation has emerged as a significant point of friction in U.S.-EU relations.
The Free Speech Debate
The DSA has sparked intense debate about the proper role of government in regulating online speech. Supporters argue the law protects citizens from harmful content, prevents the spread of misinformation, and holds powerful platforms accountable. The European Commission maintains that the DSA preserves fundamental rights while creating a safer digital environment.
Critics, however, contend that the DSA represents an unprecedented expansion of government control over online expression. By defining vague categories like "harmful content" and threatening massive fines for non-compliance, the law effectively compels platforms to over-moderate and remove content that might be entirely legal.
Civil liberties advocates have raised several concerns:
Vague Standards: Terms like "harmful content," "misinformation," and threats to "civic discourse" lack precise definitions, giving regulators broad discretion to pressure platforms over content they disfavor.
Prior Restraint: By forcing platforms to proactively identify and remove content before it spreads, the DSA creates a system of prior restraint that would be unconstitutional in many democracies, including the United States.
Chilling Effect: Faced with potential fines in the billions, platforms have strong incentives to remove borderline content rather than risk regulatory action. This dynamic inevitably suppresses legitimate speech.
Political Control: Critics worry that government officials can use the threat of DSA enforcement to pressure platforms over politically controversial content, effectively giving authorities veto power over online debate.
Information Access: The elimination of tools like CrowdTangle—which Meta shut down despite objections from regulators and civil society—makes it harder for independent researchers and journalists to scrutinize platform operations, reducing transparency despite the DSA's stated goals.
The Compliance Cost Burden
Beyond the potential fines, the DSA imposes enormous compliance costs on platforms. Industry estimates suggest that major tech companies spend hundreds of millions of dollars annually just to meet DSA requirements—costs that multiply across numerous regulatory frameworks.
These expenses include:
- Hiring thousands of content moderators with expertise in European languages and laws
- Developing sophisticated systems to detect and categorize potentially illegal content
- Conducting regular risk assessments and independent audits
- Creating appeals processes and out-of-court dispute settlement mechanisms
- Providing detailed transparency reports every six months
- Paying supervisory fees to fund EU oversight (Meta and TikTok are currently challenging these fees in court, arguing the calculation methodology is opaque and discriminatory)
The Computer & Communications Industry Association estimates that EU digital regulations cost U.S. tech firms billions annually in compliance expenses, effectively functioning as a tax on American innovation.
What Happens Next
The European Commission's preliminary findings will mark the next phase in the formal proceedings against Meta. Once the findings are issued, Meta will have an opportunity to respond to the specific allegations and propose commitments to address the Commission's concerns.
If the Commission ultimately finds Meta in violation of the DSA, it could:
- Issue a fine of up to 6% of global annual revenue
- Require specific changes to Meta's content moderation systems
- Impose periodic penalty payments if Meta fails to comply within specified deadlines
- In extreme cases, potentially restrict Meta's operations in the EU
Meta would have the right to challenge any adverse decision in EU courts, a process that could take years to resolve. The company has already demonstrated its willingness to contest EU regulatory actions, as evidenced by its ongoing legal challenge to supervisory fees and its pushback against previous enforcement actions.
Broader Implications
The Meta investigation reflects broader questions about the future of internet governance and the balance between platform accountability and free expression. The DSA represents a fundamentally different approach from American legal traditions, which generally grant broad protections to online speech and limit government's ability to compel content removal.
European officials argue their model better protects citizens from real harms—from terrorist content to child exploitation to election interference. They maintain that massive platforms wield enormous power over public discourse and must be held accountable when they fail to prevent serious harms.
American critics counter that the European approach grants government authorities dangerous control over online expression and sets a troubling precedent that authoritarian regimes could exploit. They warn that as more countries adopt DSA-style regulations, the global internet could fragment into regional systems with dramatically different rules for acceptable speech.
The outcome of Meta's case could influence how other major platforms approach content moderation in Europe and whether they choose to maintain different standards across jurisdictions. It may also affect the willingness of smaller platforms to operate in the EU market, given the enormous compliance burdens and financial risks.
The Transatlantic Divide
The DSA controversy highlights a widening transatlantic split on digital governance. While European regulators have embraced an aggressive approach to platform oversight—backed by potentially massive fines—American policymakers have largely resisted similar interventions, citing First Amendment concerns and fears of government overreach.
This divergence creates significant challenges for global platforms that must navigate fundamentally incompatible regulatory frameworks. Content that European authorities deem "harmful" and require removal might be constitutionally protected speech in the United States. Transparency requirements that satisfy Brussels might conflict with American privacy laws or intellectual property protections.
The Trump administration has signaled it may retaliate against what it views as discriminatory European enforcement, potentially escalating tensions into a broader trade dispute. President Trump has called for reciprocal measures against European companies and suggested the EU's tech regulations violate international trade commitments.
Conclusion
As the European Commission prepares formal charges against Meta over content moderation practices, the case promises to be a defining test of the Digital Services Act's scope and legitimacy. With potential fines in the billions and fundamental questions about speech regulation at stake, the outcome will reverberate far beyond Meta's platforms.
For Meta, the investigation represents another chapter in an increasingly contentious relationship with European regulators. For the EU, it's an opportunity to demonstrate that its ambitious regulatory framework has teeth and that even the world's largest tech companies must answer to European authority.
For internet users—both in Europe and globally—the case raises profound questions about who should decide what content is acceptable online and what role government should play in regulating digital speech. As these proceedings unfold, they will help define the future of the internet and the balance between safety and freedom in the digital age.
The coming months will reveal whether Meta can satisfy Brussels' demands without fundamentally compromising its business model, whether the EU's approach to content governance will withstand legal and political challenges, and whether the transatlantic alliance can bridge its growing divide over digital regulation.
One thing seems certain: the battle over who controls online speech is far from over, and Meta's case is just the latest skirmish in what promises to be a long war.
The European Commission is expected to issue its preliminary findings against Meta within the coming weeks. Meta will then have an opportunity to respond before any final enforcement decision is made.