France Plans Social Media Ban for Under-15s, Night Curfew for Teens in Online Safety Push

The420.in Staff

France has joined a growing list of countries moving to tighten controls on children’s access to social media, amid deepening concerns over online safety, mental health and digital addiction. In his New Year’s Eve address, French President Emmanuel Macron indicated that the government was prepared to take tougher steps to protect “children and teenagers from social media and screens.” His remarks have been widely interpreted as backing proposed legislation to restrict social media access for children under the age of 15.

The proposal would place France alongside Australia, where some of the world’s strictest rules on underage social media use are already in force. The French move comes as governments across Europe and other regions reassess how large digital platforms influence children’s behaviour, attention spans and mental well-being.

What does the draft bill propose?

According to French and international media reports, the draft legislation is scheduled to be debated in the French Parliament on January 19, 2026. The bill proposes a complete ban on social media access for children below 15, placing the legal responsibility for age verification squarely on social media companies.

The proposal also introduces a night-time “digital curfew” for teenagers aged 15 to 18, under which social media use could be restricted during late hours. Policymakers argue that excessive screen exposure at night disrupts sleep cycles, affects academic performance and contributes to rising stress and anxiety among adolescents.

In addition, the bill seeks to expand restrictions on mobile phone use in schools. While mobile phones are already banned in primary and middle schools in France, the new legislation could extend similar restrictions to high schools, citing concerns over classroom distraction, cyberbullying and discipline.

Findings behind the push

The proposal follows a six-month parliamentary investigation into the impact of social media on minors. The inquiry reportedly found that platforms such as TikTok were deliberately steering children and teenagers towards addictive and potentially harmful content, driven by engagement-focused algorithms.

Investigators concluded that minors were repeatedly exposed to content related to self-harm, extreme dieting, risky behaviour and distorted body images. These findings intensified calls within France for tougher regulatory intervention, with lawmakers arguing that voluntary safety measures adopted by platforms have failed to adequately protect children.

Global experts sound the alarm

Cybercrime and digital safety experts in several countries, including India, have echoed similar concerns. The Future Crime Research Foundation (FCRF), in its recent studies, has warned that unchecked social media use among children and teenagers could fuel a mental health crisis, increase vulnerability to cybercrime, and expose minors to digital exploitation. According to the foundation, algorithm-driven content delivery is shaping children’s behaviour and decision-making in ways that cannot be effectively addressed through platform-led self-regulation alone.

Why are governments tightening rules?

Governments worldwide argue that the very design of social media platforms promotes excessive screen time, compulsive usage and constant social comparison—factors that negatively affect children’s and adolescents’ mental health.

Experts have cautioned that features such as endless scrolling, disappearing content and frequent notifications erode attention spans, disrupt sleep and heighten stress. Risks linked to cyberbullying, online grooming, financial scams and sexual exploitation have further strengthened the case for stricter regulatory safeguards.

A global trend towards age-based restrictions

France’s proposal follows recent action in Australia, where social media companies have been directed to deactivate accounts of users under 16 and regularly reassess whether their platforms qualify as age-restricted services.

Elsewhere, Spain and Greece are considering mandatory age limits on platforms such as Facebook and X, while Italy and Denmark are testing age-verification systems to block underage access. New Zealand is weighing legislation inspired by Australia’s framework, and Southeast Asian countries including Indonesia and Malaysia have announced plans to strengthen online protections for children and teenagers.

How are social media companies responding?

Major social media companies have rolled out safety features aimed at younger users, but regulators argue these measures remain inconsistent and inadequate. Meta, the owner of Facebook and Instagram, has tightened age checks and restricted recommendations of sensitive content to teen accounts. Snapchat has made teen accounts private by default and limited contact from unknown users.

TikTok and other platforms have introduced screen-time limits and content filters. However, policymakers in France and elsewhere—along with organisations such as FCRF—maintain that self-regulation has not kept pace with the scale and speed of digital harm, prompting calls for legally binding standards.

As the French Parliament prepares to debate the draft bill later this month, the outcome will be closely watched across Europe. If passed, the legislation could mark a significant policy shift in how democracies balance digital freedoms with child protection in an increasingly online world.