TikTok Faces Potential Fines After EU Preliminary Findings

EU Targets TikTok’s ‘Addictive Design’, Flags Autoplay and Infinite Scroll Under Digital Rules

The420 Correspondent

London: The European Union has accused TikTok of violating its sweeping digital regulations, alleging that the video-sharing app’s core design features are intentionally built to keep users hooked, exposing children and vulnerable adults to potential mental and physical health risks.

In preliminary findings released on Friday, the European Commission said TikTok’s use of autoplay, infinite scroll and algorithm-driven content recommendations amounts to “addictive design” — a practice restricted under the bloc’s flagship Digital Services Act (DSA). The law requires large online platforms to identify, assess and mitigate systemic risks arising from their products, including those linked to compulsive use and psychological harm.

EU regulators said their investigation found that TikTok had failed to conduct a comprehensive assessment of how its design choices affect user well-being. According to the Commission, the company did not sufficiently evaluate the impact of endless content feeds that encourage prolonged screen time, nor did it provide convincing evidence of effective safeguards for minors, who make up a significant portion of the app’s user base across Europe.

“The platform’s design may stimulate behavioural patterns that undermine users’ ability to disengage,” the Commission said, warning that such effects are particularly pronounced among children and individuals already susceptible to anxiety, sleep disorders and attention-related problems.

The findings strike at the core of TikTok’s engagement-driven business model, which relies on rapid, personalised video streams to maximise the time users spend on the app. If upheld, the Commission’s conclusions could force the company to significantly alter how content is presented to more than 150 million users across the European Union.

TikTok, owned by China’s ByteDance, strongly pushed back against the accusations. In a statement, the company said the Commission’s conclusions offered a “categorically false and entirely meritless depiction” of its platform. It maintained that user safety has been embedded into its products from the outset and said it has invested heavily in features designed to help users, especially teenagers, manage their screen time.

The company highlighted measures such as default daily screen-time limits for younger users, prompts that encourage breaks after extended viewing sessions, and settings that allow users to disable autoplay. TikTok said it would “take whatever steps are necessary” to challenge the preliminary findings and defend its compliance with EU law.

Under the Digital Services Act, regulators can impose fines of up to 6% of a company’s global annual turnover for serious violations. In extreme cases, authorities also have the power to restrict or suspend access to a service within the EU. While no penalties have been announced at this stage, Friday’s move marks a crucial step in the enforcement process. TikTok will now be given the opportunity to review the evidence, submit a detailed written response and propose corrective measures.

The case reflects a broader and increasingly aggressive push by European authorities to rein in the power of large technology platforms and curb design practices seen as exploitative. Similar concerns have been raised worldwide about so-called “dark patterns” — interface designs that subtly nudge users into behaviours that benefit companies, often at the cost of user well-being.

Mental health experts have long argued that features such as infinite scroll are engineered to exploit psychological vulnerabilities, keeping users engaged far longer than intended. Multiple studies have linked excessive social media use among adolescents to sleep deprivation, reduced concentration and higher rates of anxiety and depression.

The EU’s action also comes amid growing political pressure within member states to strengthen protections for children online. Several governments have tightened national rules on age verification, advertising aimed at minors and data use, while urging Brussels to adopt a tougher stance on platforms with youth-heavy audiences.

For TikTok, which already faces scrutiny in multiple countries over data protection, content moderation and national security concerns, the outcome of the EU probe could have far-reaching global implications. Any design changes mandated in Europe may be difficult to limit to one region, potentially reshaping how the app operates worldwide.

The Commission said its final decision will depend on TikTok’s response and the adequacy of any remedies proposed. Until then, the case sends a clear signal that European regulators are willing to challenge even the most fundamental mechanics of social media platforms in the name of user protection.

About the author — Suvedita Nath is a science student with a growing interest in cybercrime and digital safety. She writes on online activity, cyber threats, and technology-driven risks. Her work focuses on clarity, accuracy, and public awareness.