EU Accuses Meta of Violating Digital Rules Regarding Underage Users

The European Commission has issued a preliminary finding against Meta, alleging that the company has failed to adequately prevent children under the age of 13 from accessing Instagram and Facebook. This move signals a significant escalation in the EU’s effort to enforce the Digital Services Act (DSA), a landmark piece of legislation designed to hold tech giants accountable for online safety.

The Core of the Allegation: Ineffective Age Verification

While Meta’s own terms of service mandate a minimum age of 13, the Commission argues that the company’s enforcement mechanisms are fundamentally flawed. The primary issue lies in the ease of bypassing these rules:

  • False Information: Children can simply enter a false date of birth during the sign-up process.
  • Lack of Verification: No robust mechanism currently exists to check whether the age a user provides is accurate.
  • Discrepancies in Data: The Commission estimates that 10–12% of users on Instagram and Facebook are under the age of 13, a figure that contradicts Meta’s own internal assessments.

Furthermore, the Commission noted that Meta has seemingly “disregarded readily available scientific evidence” regarding the specific vulnerabilities of younger children to the harms associated with these social media platforms.

Meta’s Defense: An “Industry-Wide Challenge”

Meta has formally disagreed with these preliminary findings. In a statement provided to Euronews, a spokesperson for the company emphasized that their platforms are intended for users 13 and older and that they are actively investing in technology to detect and remove underage accounts.

“Understanding age is an industry-wide challenge, which requires an industry-wide solution,” Meta stated, signaling their intent to continue working with regulators while defending their current efforts.

The company also teased that more information regarding “additional measures” will be released next week, suggesting that new technical solutions may be on the horizon.

The Broader Context: A Push for Stricter Controls

This investigation is not an isolated event; it is part of a growing movement across Europe to tighten digital safety for minors. Several EU member states are currently debating blanket social media bans for children under 15.

However, the transition from policy to practice faces a major technical hurdle: how to verify age without compromising user privacy. To address this, European Commission President Ursula von der Leyen recently announced that a dedicated EU age-verification app is technically ready for rollout, though no specific launch date has been set.

What is at Stake?

The legal process is far from over. Meta now has the opportunity to review the Commission’s investigation files and submit a written response.

If the Commission’s findings are finalized and Meta is found to be in non-compliance, the consequences could be severe. The company could face formal sanctions and fines of up to 6% of its total worldwide annual turnover, a penalty that could reach into the billions of euros.


Conclusion

This investigation highlights a critical tension between social media accessibility and child safety, setting the stage for a major regulatory showdown that could redefine how age is verified across the entire internet.