New Rules Target Social Media Algorithms

New York Moves to Restrict Algorithmic Feeds for Children on Social Media

The420.in Staff

New York state regulators have released draft rules aimed at limiting how social media platforms operate for children under 18. The proposal would ban algorithmic, personalized feeds for minors unless parents explicitly grant approval, marking one of the strictest state-level interventions in the U.S.

The rules give platforms 180 days to comply and require age verification mechanisms to ensure enforcement. Officials say the measures are designed to reduce children’s exposure to addictive and harmful content.

Building on SAFE for Kids Act

The draft rules build on the Stop Addictive Feeds Exploitation (SAFE) for Kids Act, passed last year. That legislation already required parental approval before children could access algorithm-driven feeds. The new framework strengthens enforcement by mandating deadlines and verification systems.

Lawmakers argue that platforms’ reliance on engagement-driven algorithms has harmful effects, particularly for young users. “This is about putting parents back in control of their children’s online lives,” one legislator said, emphasizing that parental oversight must be central to digital safety.

Concerns Over Mental Health

Experts point out that algorithmic feeds are designed to maximize time spent on apps, often by promoting sensational or emotionally charged content. Studies link such mechanisms to increased rates of anxiety, depression, and sleep disruption among minors. Regulators argue that breaking the cycle of addictive scrolling is essential for safeguarding youth mental health.

Parents and child advocacy groups have welcomed the move, though tech companies are expected to push back, citing concerns about free speech, operational challenges, and the risk of driving minors to unregulated platforms.

Part of a Global Push

New York’s initiative reflects a broader international trend. Australia will soon implement rules requiring all platforms to verify user ages and secure parental consent for minors. In Europe, the Digital Services Act already compels platforms to curb targeted advertising for children.

Analysts say these developments illustrate growing global consensus: protecting children online now outweighs unrestricted platform design. As one expert noted, “This is a turning point in how societies balance digital freedom with child safety.”
