The European Union has opened a formal investigation into Snapchat, raising questions over age verification, child safety controls and whether minors can be exposed to grooming, manipulative design and illegal or age-restricted products on the platform.

European Union Launches Investigation into Snapchat’s Child Safety Policies

The420 Correspondent

Brussels: European Union (EU) regulators on Thursday launched an investigation into child protection measures on the social media platform Snapchat, operated by Snap Inc. The move comes amid increasing global scrutiny of social media companies over the safety and accountability of their platforms for younger users.

Officials in Brussels accused Snapchat of having an ineffective age verification system, making it difficult to prevent children under 13 from accessing the platform. Additionally, the company’s algorithm reportedly misclassifies users aged 13 to 17 as adults, exposing them to inappropriate content and experiences.

The case emerges amid growing scrutiny of social media firms. Last week, a California jury held Meta and YouTube responsible for harming the mental health of a teenage user through addictive design features, setting a precedent for potential future lawsuits. Earlier this week, a New Mexico jury found Meta liable for violating state laws by failing to protect children from online predators.

Across Europe, several governments are considering new regulations to limit children’s social media use. France, Denmark, and Spain are among countries exploring restrictions for young users, as policymakers increasingly view social media as addictive and harmful to mental health.

In February, EU regulators issued a preliminary ruling against TikTok over its “addictive design”, which they said posed potential risks to the physical and mental well-being of users, including minors. Meta has also been under investigation since 2024 over the protection of child users on Instagram and Facebook.

On Thursday, EU regulators accused Snap of failing to adequately protect minors from being contacted by adults posing as children to recruit them into sexual exploitation and criminal activities. The company is being investigated under the Digital Services Act (2022), legislation that imposes stricter oversight of online platforms to curb illicit content and behavior. Henna Virkkunen, European Commissioner for Digital Policy, said the law “demands high safety standards for all users.”

Snap maintains that users under the age of 13 are not allowed on its platform. However, the European Commission noted that a significant portion of the platform’s 97 million users across the 27-nation bloc falls below this age threshold. In Denmark, half of 10-year-olds use Snapchat; in France, nearly a third of 11-year-olds report being active on the platform.

“Once on the platform, children are exposed to dangerous contacts for grooming and, even worse, can access prohibited products such as drugs, vapes, and alcohol,” said Thomas Regnier, a spokesman for the European Commission, during a Thursday news briefing.

The duration of the investigation remains unclear. Snapchat could face fines of up to 6 percent of its global revenue, though regulators rarely impose penalties of that magnitude.

Snap said it is cooperating fully with European authorities. In a statement, the company said, “Snapchat is designed to help people communicate with close friends and family in a safe and trusted environment, with privacy and safety built in from the start — including additional protections for teens. As online risks evolve, we continuously review, strengthen, and invest in these safeguards.”

The European Union is also developing a “digital wallet” system to provide proof of age without disclosing additional personal information.

In a related case on Thursday, EU regulators issued preliminary rulings against several pornography platforms for failing to prevent minors from accessing their services. Many of these sites currently only require users to click a button confirming they are over 18 before entering restricted content.
